25675 1727203980.10747: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-bGV executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 25675 1727203980.11164: Added group all to inventory 25675 1727203980.11166: Added group ungrouped to inventory 25675 1727203980.11170: Group all now contains ungrouped 25675 1727203980.11173: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml 25675 1727203980.28217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 25675 1727203980.28287: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 25675 1727203980.28310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 25675 1727203980.28382: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 25675 1727203980.28454: Loaded config def from plugin (inventory/script) 25675 1727203980.28457: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 25675 1727203980.28499: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 25675 1727203980.28588: Loaded config def from plugin (inventory/yaml) 25675 1727203980.28590: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 25675 1727203980.28674: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 25675 1727203980.29101: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 25675 1727203980.29105: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 25675 1727203980.29108: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 25675 1727203980.29114: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 25675 1727203980.29118: Loading data from /tmp/network-zt6/inventory-rSl.yml 25675 1727203980.29186: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto 25675 1727203980.29249: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 25675 1727203980.29290: Loading data from /tmp/network-zt6/inventory-rSl.yml 25675 1727203980.29372: group all already in inventory 25675 1727203980.29381: set inventory_file for managed-node1 25675 1727203980.29385: set inventory_dir for managed-node1 25675 1727203980.29386: Added host managed-node1 to inventory 25675 1727203980.29388: Added host managed-node1 to group all 25675 1727203980.29389: set ansible_host for managed-node1 25675 1727203980.29390: set ansible_ssh_extra_args for managed-node1 25675 1727203980.29393: set inventory_file for managed-node2 25675 1727203980.29396: set inventory_dir for managed-node2 25675 1727203980.29396: Added host managed-node2 to inventory 25675 1727203980.29398: Added host managed-node2 to group 
all 25675 1727203980.29399: set ansible_host for managed-node2 25675 1727203980.29399: set ansible_ssh_extra_args for managed-node2 25675 1727203980.29402: set inventory_file for managed-node3 25675 1727203980.29404: set inventory_dir for managed-node3 25675 1727203980.29405: Added host managed-node3 to inventory 25675 1727203980.29406: Added host managed-node3 to group all 25675 1727203980.29407: set ansible_host for managed-node3 25675 1727203980.29407: set ansible_ssh_extra_args for managed-node3 25675 1727203980.29410: Reconcile groups and hosts in inventory. 25675 1727203980.29413: Group ungrouped now contains managed-node1 25675 1727203980.29415: Group ungrouped now contains managed-node2 25675 1727203980.29417: Group ungrouped now contains managed-node3 25675 1727203980.29492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 25675 1727203980.29613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 25675 1727203980.29659: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 25675 1727203980.29689: Loaded config def from plugin (vars/host_group_vars) 25675 1727203980.29691: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 25675 1727203980.29698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 25675 1727203980.29706: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 25675 1727203980.29746: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 25675 1727203980.30064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203980.30177: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 25675 1727203980.30224: Loaded config def from plugin (connection/local) 25675 1727203980.30228: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 25675 1727203980.31308: Loaded config def from plugin (connection/paramiko_ssh) 25675 1727203980.31311: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 25675 1727203980.32435: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25675 1727203980.32478: Loaded config def from plugin (connection/psrp) 25675 1727203980.32482: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 25675 1727203980.33207: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25675 1727203980.33246: Loaded config def from plugin (connection/ssh) 25675 1727203980.33249: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 25675 1727203980.35705: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25675 1727203980.35745: Loaded config def from plugin (connection/winrm) 25675 1727203980.35748: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 25675 1727203980.35988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 25675 1727203980.36058: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 25675 1727203980.36134: Loaded config def from plugin (shell/cmd) 25675 1727203980.36136: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 25675 1727203980.36164: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 25675 1727203980.36389: Loaded config def from plugin (shell/powershell) 25675 1727203980.36392: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 25675 1727203980.36454: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 25675 1727203980.36643: Loaded config def from plugin (shell/sh) 25675 1727203980.36645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 25675 1727203980.36691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 25675 1727203980.36818: Loaded config def from plugin (become/runas) 25675 1727203980.36820: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 25675 1727203980.37010: Loaded config def from plugin (become/su) 25675 1727203980.37013: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 25675 1727203980.37178: Loaded config def from plugin (become/sudo) 25675 1727203980.37180: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 25675 1727203980.37213: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml 25675 1727203980.37540: in VariableManager get_vars() 25675 1727203980.37561: done with get_vars() 25675 1727203980.37701: trying /usr/local/lib/python3.12/site-packages/ansible/modules 25675 1727203980.41442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 25675 1727203980.41552: in VariableManager get_vars() 25675 1727203980.41557: done with get_vars() 25675 1727203980.41559: variable 'playbook_dir' from source: magic vars 25675 1727203980.41560: variable 'ansible_playbook_python' from source: magic vars 25675 1727203980.41561: variable 'ansible_config_file' from 
source: magic vars 25675 1727203980.41561: variable 'groups' from source: magic vars 25675 1727203980.41562: variable 'omit' from source: magic vars 25675 1727203980.41563: variable 'ansible_version' from source: magic vars 25675 1727203980.41563: variable 'ansible_check_mode' from source: magic vars 25675 1727203980.41564: variable 'ansible_diff_mode' from source: magic vars 25675 1727203980.41565: variable 'ansible_forks' from source: magic vars 25675 1727203980.41565: variable 'ansible_inventory_sources' from source: magic vars 25675 1727203980.41566: variable 'ansible_skip_tags' from source: magic vars 25675 1727203980.41567: variable 'ansible_limit' from source: magic vars 25675 1727203980.41567: variable 'ansible_run_tags' from source: magic vars 25675 1727203980.41568: variable 'ansible_verbosity' from source: magic vars 25675 1727203980.41602: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml 25675 1727203980.42280: in VariableManager get_vars() 25675 1727203980.42297: done with get_vars() 25675 1727203980.42336: in VariableManager get_vars() 25675 1727203980.42358: done with get_vars() 25675 1727203980.42395: in VariableManager get_vars() 25675 1727203980.42409: done with get_vars() 25675 1727203980.42439: in VariableManager get_vars() 25675 1727203980.42451: done with get_vars() 25675 1727203980.42527: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 25675 1727203980.42716: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 25675 1727203980.42862: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 25675 1727203980.43518: in VariableManager get_vars() 25675 1727203980.43541: done with get_vars() 25675 1727203980.44017: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 25675 1727203980.44146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25675 1727203980.46022: in VariableManager get_vars() 25675 1727203980.46042: done with get_vars() 25675 1727203980.46169: in VariableManager get_vars() 25675 1727203980.46173: done with get_vars() 25675 1727203980.46177: variable 'playbook_dir' from source: magic vars 25675 1727203980.46178: variable 'ansible_playbook_python' from source: magic vars 25675 1727203980.46179: variable 'ansible_config_file' from source: magic vars 25675 1727203980.46180: variable 'groups' from source: magic vars 25675 1727203980.46180: variable 'omit' from source: magic vars 25675 1727203980.46181: variable 'ansible_version' from source: magic vars 25675 1727203980.46182: variable 'ansible_check_mode' from source: magic vars 25675 1727203980.46183: variable 'ansible_diff_mode' from source: magic vars 25675 1727203980.46183: variable 'ansible_forks' from source: magic vars 25675 1727203980.46184: variable 'ansible_inventory_sources' from source: magic vars 25675 1727203980.46185: variable 'ansible_skip_tags' from source: magic vars 25675 1727203980.46186: variable 'ansible_limit' from source: magic vars 25675 1727203980.46186: variable 'ansible_run_tags' from source: magic vars 25675 1727203980.46187: variable 'ansible_verbosity' from source: magic vars 25675 1727203980.46220: Loading data from 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 25675 1727203980.46294: in VariableManager get_vars() 25675 1727203980.46298: done with get_vars() 25675 1727203980.46300: variable 'playbook_dir' from source: magic vars 25675 1727203980.46301: variable 'ansible_playbook_python' from source: magic vars 25675 1727203980.46301: variable 'ansible_config_file' from source: magic vars 25675 1727203980.46302: variable 'groups' from source: magic vars 25675 1727203980.46303: variable 'omit' from source: magic vars 25675 1727203980.46304: variable 'ansible_version' from source: magic vars 25675 1727203980.46304: variable 'ansible_check_mode' from source: magic vars 25675 1727203980.46305: variable 'ansible_diff_mode' from source: magic vars 25675 1727203980.46306: variable 'ansible_forks' from source: magic vars 25675 1727203980.46306: variable 'ansible_inventory_sources' from source: magic vars 25675 1727203980.46307: variable 'ansible_skip_tags' from source: magic vars 25675 1727203980.46308: variable 'ansible_limit' from source: magic vars 25675 1727203980.46309: variable 'ansible_run_tags' from source: magic vars 25675 1727203980.46309: variable 'ansible_verbosity' from source: magic vars 25675 1727203980.46339: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 25675 1727203980.46421: in VariableManager get_vars() 25675 1727203980.46433: done with get_vars() 25675 1727203980.46474: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 25675 1727203980.46584: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 25675 1727203980.46661: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 25675 1727203980.47134: in VariableManager get_vars() 25675 1727203980.47154: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25675 1727203980.48607: in VariableManager get_vars() 25675 1727203980.48627: done with get_vars() 25675 1727203980.48665: in VariableManager get_vars() 25675 1727203980.48668: done with get_vars() 25675 1727203980.48670: variable 'playbook_dir' from source: magic vars 25675 1727203980.48670: variable 'ansible_playbook_python' from source: magic vars 25675 1727203980.48671: variable 'ansible_config_file' from source: magic vars 25675 1727203980.48672: variable 'groups' from source: magic vars 25675 1727203980.48673: variable 'omit' from source: magic vars 25675 1727203980.48673: variable 'ansible_version' from source: magic vars 25675 1727203980.48674: variable 'ansible_check_mode' from source: magic vars 25675 1727203980.48676: variable 'ansible_diff_mode' from source: magic vars 25675 1727203980.48677: variable 'ansible_forks' from source: magic vars 25675 1727203980.48678: variable 'ansible_inventory_sources' from source: magic vars 25675 1727203980.48678: variable 'ansible_skip_tags' from source: magic vars 25675 1727203980.48679: variable 'ansible_limit' from source: magic vars 25675 1727203980.48680: variable 'ansible_run_tags' from source: magic vars 25675 1727203980.48681: variable 'ansible_verbosity' from source: magic vars 25675 1727203980.48712: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 25675 1727203980.48785: in 
VariableManager get_vars() 25675 1727203980.48796: done with get_vars() 25675 1727203980.48835: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 25675 1727203980.50620: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 25675 1727203980.50702: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 25675 1727203980.51086: in VariableManager get_vars() 25675 1727203980.51107: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25675 1727203980.52617: in VariableManager get_vars() 25675 1727203980.52634: done with get_vars() 25675 1727203980.52670: in VariableManager get_vars() 25675 1727203980.52685: done with get_vars() 25675 1727203980.52747: in VariableManager get_vars() 25675 1727203980.52759: done with get_vars() 25675 1727203980.52855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 25675 1727203980.52869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 25675 1727203980.53105: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 25675 1727203980.53257: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 25675 1727203980.53260: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 25675 1727203980.53290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 25675 1727203980.53312: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 25675 1727203980.53455: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 25675 1727203980.53513: Loaded config def from plugin (callback/default) 25675 1727203980.53515: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 25675 1727203980.54611: Loaded config def from plugin (callback/junit) 25675 1727203980.54614: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 25675 1727203980.54658: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 25675 1727203980.54727: Loaded config def from plugin 
(callback/minimal) 25675 1727203980.54730: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 25675 1727203980.54770: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 25675 1727203980.54832: Loaded config def from plugin (callback/tree) 25675 1727203980.54835: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 25675 1727203980.54962: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 25675 1727203980.54964: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_ethernet_nm.yml ************************************************ 10 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml 25675 1727203980.54992: in VariableManager get_vars() 25675 1727203980.55005: done with get_vars() 25675 1727203980.55011: in VariableManager get_vars() 25675 1727203980.55018: done with get_vars() 25675 1727203980.55023: variable 'omit' from source: magic vars 25675 1727203980.55061: in VariableManager get_vars() 25675 1727203980.55077: done with get_vars() 25675 1727203980.55098: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] ********* 25675 1727203980.55613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 25675 1727203980.55683: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 25675 1727203980.55718: getting the remaining hosts for this loop 25675 1727203980.55720: done getting the remaining hosts for this loop 25675 1727203980.55723: getting the next task for host managed-node2 25675 1727203980.55726: done getting next task for host managed-node2 25675 1727203980.55728: ^ task is: TASK: Gathering Facts 25675 1727203980.55729: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203980.55736: getting variables 25675 1727203980.55738: in VariableManager get_vars() 25675 1727203980.55747: Calling all_inventory to load vars for managed-node2 25675 1727203980.55749: Calling groups_inventory to load vars for managed-node2 25675 1727203980.55752: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203980.55763: Calling all_plugins_play to load vars for managed-node2 25675 1727203980.55778: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203980.55782: Calling groups_plugins_play to load vars for managed-node2 25675 1727203980.55819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203980.55878: done with get_vars() 25675 1727203980.55885: done getting variables 25675 1727203980.55950: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6 Tuesday 24 September 2024 14:53:00 -0400 (0:00:00.010) 0:00:00.010 ***** 25675 1727203980.55973: entering _queue_task() for managed-node2/gather_facts 25675 1727203980.55974: Creating lock for gather_facts 25675 1727203980.56355: worker is 1 (out of 1 available) 25675 1727203980.56365: exiting _queue_task() for managed-node2/gather_facts 25675 1727203980.56384: done queuing things up, now waiting for results queue to drain 25675 1727203980.56386: waiting for pending results... 
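The "Calling all_inventory / groups_plugins_play" and get_vars() entries above are the VariableManager assembling host and group variables for managed-node2 from the inventory sources parsed at the start of the run. A minimal sketch of the same inventory and variable loading through Ansible's Python API follows; the inventory path is the one reported in the log, but its contents are never printed here, so treat the snippet purely as an illustration rather than part of the test suite, and note that these ansible-core classes are internal API that can change between releases.

# Sketch only: load the same YAML inventory the log reports and ask the
# VariableManager for one host's variables, mirroring the get_vars() entries above.
from ansible.parsing.dataloader import DataLoader
from ansible.inventory.manager import InventoryManager
from ansible.vars.manager import VariableManager

loader = DataLoader()
# Path taken from the log; substitute any local YAML inventory when trying this.
inventory = InventoryManager(loader=loader, sources=["/tmp/network-zt6/inventory-rSl.yml"])
variable_manager = VariableManager(loader=loader, inventory=inventory)

for host in inventory.get_hosts():
    # ansible_host and ansible_ssh_extra_args were set per host during inventory parsing.
    hostvars = variable_manager.get_vars(host=host)
    print(host.name, hostvars.get("ansible_host"))
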
25675 1727203980.56796: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727203980.56804: in run() - task 028d2410-947f-41bd-b19d-00000000007c 25675 1727203980.56808: variable 'ansible_search_path' from source: unknown 25675 1727203980.56816: calling self._execute() 25675 1727203980.56819: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203980.56826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203980.56839: variable 'omit' from source: magic vars 25675 1727203980.56937: variable 'omit' from source: magic vars 25675 1727203980.56967: variable 'omit' from source: magic vars 25675 1727203980.57007: variable 'omit' from source: magic vars 25675 1727203980.57050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203980.57090: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203980.57112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203980.57132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203980.57147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203980.57180: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203980.57188: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203980.57195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203980.57345: Set connection var ansible_shell_type to sh 25675 1727203980.57357: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203980.57372: Set connection var ansible_timeout to 10 25675 1727203980.57385: Set connection var ansible_pipelining to False 25675 1727203980.57395: Set connection var ansible_shell_executable to /bin/sh 25675 1727203980.57401: Set connection var ansible_connection to ssh 25675 1727203980.57433: variable 'ansible_shell_executable' from source: unknown 25675 1727203980.57441: variable 'ansible_connection' from source: unknown 25675 1727203980.57449: variable 'ansible_module_compression' from source: unknown 25675 1727203980.57455: variable 'ansible_shell_type' from source: unknown 25675 1727203980.57463: variable 'ansible_shell_executable' from source: unknown 25675 1727203980.57478: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203980.57487: variable 'ansible_pipelining' from source: unknown 25675 1727203980.57494: variable 'ansible_timeout' from source: unknown 25675 1727203980.57501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203980.57691: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203980.57706: variable 'omit' from source: magic vars 25675 1727203980.57799: starting attempt loop 25675 1727203980.57802: running the handler 25675 1727203980.57805: variable 'ansible_facts' from source: unknown 25675 1727203980.57808: _low_level_execute_command(): starting 25675 1727203980.57810: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203980.58507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203980.58572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203980.58641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203980.58657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203980.58785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203980.60473: stdout chunk (state=3): >>>/root <<< 25675 1727203980.60578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203980.60604: stderr chunk (state=3): >>><<< 25675 1727203980.60607: stdout chunk (state=3): >>><<< 25675 1727203980.60627: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203980.60638: _low_level_execute_command(): starting 25675 1727203980.60652: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770 `" && echo ansible-tmp-1727203980.606273-25692-96108671387770="` echo /root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770 `" ) && sleep 0' 25675 1727203980.61081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203980.61086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203980.61088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203980.61091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203980.61102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203980.61143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203980.61150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203980.61152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203980.61225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203980.63145: stdout chunk (state=3): >>>ansible-tmp-1727203980.606273-25692-96108671387770=/root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770 <<< 25675 1727203980.63282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203980.63287: stdout chunk (state=3): >>><<< 25675 1727203980.63289: stderr chunk (state=3): >>><<< 25675 1727203980.63300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203980.606273-25692-96108671387770=/root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203980.63325: variable 'ansible_module_compression' from source: unknown 25675 1727203980.63364: ANSIBALLZ: Using generic lock for ansible.legacy.setup 25675 1727203980.63370: ANSIBALLZ: Acquiring lock 25675 1727203980.63373: ANSIBALLZ: Lock acquired: 139822507557424 
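With the remote temporary directory in place, the ANSIBALLZ entries that follow show the controller packaging the ansible.legacy.setup module: the module and its Python dependencies are zipped, embedded in a one-file bootstrap script, transferred into that tmp directory, and later run with the interpreter discovered on the target. Below is a rough, self-contained sketch of that packaging idea; it is not Ansible's actual wrapper, and the module name and source in it are invented for illustration.

# Sketch of the AnsiballZ packaging idea, not Ansible's real implementation:
# bundle a module into a zip, embed it in a one-file bootstrap script, and run it.
import base64
import io
import zipfile


def build_payload(module_name: str, module_source: str) -> str:
    """Return a single self-contained Python script embedding the module as a zip."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(f"{module_name}.py", module_source)
    blob = base64.b64encode(buf.getvalue()).decode("ascii")
    # The bootstrap decodes the zip, extracts it to a temp dir, imports the module
    # and calls its main(); Ansible's real wrapper is more involved (zipimport,
    # module arguments, cleanup) but follows the same one-file pattern.
    return f"""\
import base64, io, sys, tempfile, zipfile
data = base64.b64decode("{blob}")
tmpdir = tempfile.mkdtemp()
zipfile.ZipFile(io.BytesIO(data)).extractall(tmpdir)
sys.path.insert(0, tmpdir)
import {module_name}
{module_name}.main()
"""


if __name__ == "__main__":
    # Hypothetical module; executing the payload locally stands in for running
    # AnsiballZ_setup.py on the managed node.
    payload = build_payload("hello_module", "def main():\n    print('hello from the payload')\n")
    exec(compile(payload, "<payload>", "exec"))
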
25675 1727203980.63377: ANSIBALLZ: Creating module 25675 1727203981.03969: ANSIBALLZ: Writing module into payload 25675 1727203981.04359: ANSIBALLZ: Writing module 25675 1727203981.04364: ANSIBALLZ: Renaming module 25675 1727203981.04367: ANSIBALLZ: Done creating module 25675 1727203981.04371: variable 'ansible_facts' from source: unknown 25675 1727203981.04373: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203981.04378: _low_level_execute_command(): starting 25675 1727203981.04457: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 25675 1727203981.05086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203981.05100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203981.05115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203981.05132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203981.05148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203981.05163: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727203981.05178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203981.05197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203981.05208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727203981.05274: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203981.05306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203981.05324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203981.05347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203981.05605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203981.07361: stdout chunk (state=3): >>>PLATFORM <<< 25675 1727203981.07448: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 <<< 25675 1727203981.07463: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 25675 1727203981.07649: stdout chunk (state=3): >>><<< 25675 1727203981.07652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203981.07655: stderr chunk (state=3): >>><<< 25675 1727203981.07677: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203981.07695 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 25675 1727203981.07933: _low_level_execute_command(): starting 25675 1727203981.07936: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 25675 1727203981.08025: Sending initial data 25675 1727203981.08035: Sent initial data (1181 bytes) 25675 1727203981.08845: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203981.08913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203981.09215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203981.09694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203981.13059: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 25675 1727203981.13533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203981.13559: stderr chunk 
(state=3): >>><<< 25675 1727203981.13572: stdout chunk (state=3): >>><<< 25675 1727203981.13599: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203981.13896: variable 'ansible_facts' from source: unknown 25675 1727203981.13906: variable 'ansible_facts' from source: unknown 25675 1727203981.13920: variable 'ansible_module_compression' from source: unknown 25675 1727203981.14013: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727203981.14048: variable 'ansible_facts' from source: unknown 25675 1727203981.14364: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/AnsiballZ_setup.py 25675 1727203981.14872: Sending initial data 25675 1727203981.14886: Sent initial data (152 bytes) 25675 1727203981.16002: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203981.16095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203981.16164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203981.16263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203981.16339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203981.16610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203981.18360: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203981.18461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727203981.18545: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpy5ydwyqj /root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/AnsiballZ_setup.py <<< 25675 1727203981.18548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/AnsiballZ_setup.py" <<< 25675 1727203981.18601: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpy5ydwyqj" to remote "/root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/AnsiballZ_setup.py" <<< 25675 1727203981.22589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203981.22595: stdout chunk (state=3): >>><<< 25675 1727203981.22598: stderr chunk (state=3): >>><<< 25675 1727203981.22600: done transferring module to remote 25675 1727203981.22602: _low_level_execute_command(): starting 25675 1727203981.22604: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/ /root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/AnsiballZ_setup.py && sleep 0' 25675 1727203981.24138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203981.24242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203981.24372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203981.24396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203981.24597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203981.24678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203981.26727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203981.26814: stderr chunk (state=3): >>><<< 25675 1727203981.26818: stdout chunk (state=3): >>><<< 25675 1727203981.26840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203981.27057: _low_level_execute_command(): starting 25675 1727203981.27061: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/AnsiballZ_setup.py && sleep 0' 25675 1727203981.28227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203981.28247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203981.28264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203981.28298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203981.28391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203981.28523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203981.28665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203981.31001: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 25675 1727203981.31017: stdout chunk (state=3): >>>import _imp # builtin <<< 25675 1727203981.31040: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 25675 1727203981.31109: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 25675 1727203981.31139: stdout chunk (state=3): >>>import 'posix' # <<< 25675 1727203981.31174: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 25675 1727203981.31195: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<< 25675 1727203981.31215: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 25675 1727203981.31248: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 25675 1727203981.31266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203981.31280: stdout chunk (state=3): >>>import '_codecs' # <<< 25675 1727203981.31295: stdout chunk (state=3): >>>import 'codecs' # <<< 25675 1727203981.31482: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 25675 1727203981.31487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 25675 1727203981.31490: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa530dc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa530abb00> <<< 25675 1727203981.31493: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 25675 1727203981.31495: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa530dea50> <<< 25675 1727203981.31497: stdout chunk (state=3): >>>import '_signal' # <<< 25675 1727203981.31500: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 25675 1727203981.31577: stdout chunk (state=3): >>>import '_collections_abc' # <<< 25675 1727203981.31603: stdout chunk (state=3): >>>import 'genericpath' # <<< 25675 1727203981.31615: stdout chunk (state=3): >>>import 'posixpath' # <<< 25675 1727203981.31631: stdout chunk (state=3): >>>import 'os' # <<< 25675 1727203981.31883: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 25675 1727203981.31886: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 25675 1727203981.31889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 25675 1727203981.31897: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52e91130> <<< 25675 1727203981.31900: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52e92060> <<< 25675 1727203981.31902: stdout chunk (state=3): >>>import 'site' # <<< 25675 1727203981.31990: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 25675 1727203981.32347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 25675 1727203981.32611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecfe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 25675 1727203981.32632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f078c0> <<< 25675 1727203981.32658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 25675 1727203981.32673: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f07f50> <<< 25675 1727203981.32688: stdout chunk (state=3): >>>import 
'_collections' # <<< 25675 1727203981.32729: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ee7b60> <<< 25675 1727203981.32749: stdout chunk (state=3): >>>import '_functools' # <<< 25675 1727203981.32771: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ee5280> <<< 25675 1727203981.32980: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecd040> <<< 25675 1727203981.32984: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 25675 1727203981.32987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 25675 1727203981.32989: stdout chunk (state=3): >>>import '_sre' # <<< 25675 1727203981.32991: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 25675 1727203981.32994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 25675 1727203981.32996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 25675 1727203981.33024: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f2b800> <<< 25675 1727203981.33041: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f2a420> <<< 25675 1727203981.33088: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 25675 1727203981.33108: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ee6150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ece000> <<< 25675 1727203981.33196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecc2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.33202: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f5cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5cbc0> <<< 25675 1727203981.33250: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f5cfb0> <<< 25675 1727203981.33259: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecade0> <<< 25675 1727203981.33283: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 25675 1727203981.33290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203981.33306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 25675 1727203981.33585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5d670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5d340> <<< 25675 1727203981.33596: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 25675 1727203981.33602: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 25675 1727203981.33605: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5e570> import 'importlib.util' # import 'runpy' # <<< 25675 1727203981.33607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 25675 1727203981.33612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f747a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.33615: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f75e80> <<< 25675 1727203981.33621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 25675 1727203981.33627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 25675 1727203981.33636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f76d20> <<< 25675 1727203981.33713: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f77350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f76270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 25675 1727203981.33728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 25675 1727203981.33781: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.33787: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f77d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f774a0> <<< 25675 1727203981.33864: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5e5d0> <<< 25675 1727203981.33872: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 25675 1727203981.33898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 25675 1727203981.33936: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c67c80> <<< 25675 1727203981.33955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 25675 1727203981.33996: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c90710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c90470> <<< 25675 1727203981.34123: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c90740> <<< 25675 1727203981.34127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 25675 1727203981.34129: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.34282: stdout chunk (state=3): >>># extension module '_hashlib' executed from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c91070> <<< 25675 1727203981.34561: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c91a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c90920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c65e20> <<< 25675 1727203981.34564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c92e40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c91b80> <<< 25675 1727203981.34566: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5ecc0> <<< 25675 1727203981.34571: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 25675 1727203981.34620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203981.34631: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 25675 1727203981.34885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52cbb1d0> <<< 25675 1727203981.34897: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 25675 1727203981.34900: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 25675 1727203981.34902: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ce3590> <<< 25675 1727203981.34904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 25675 1727203981.34907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 25675 1727203981.34993: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52d40380> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 25675 1727203981.35208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 25675 1727203981.35226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52d42ab0> <<< 25675 1727203981.35240: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52d40470> <<< 25675 1727203981.35284: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52d093a0> <<< 25675 1727203981.35299: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 25675 1727203981.35325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52b413d0> <<< 25675 1727203981.35338: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ce2390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c93da0> <<< 25675 1727203981.35549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7faa52b41640> <<< 25675 1727203981.35886: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_guq_f69m/ansible_ansible.legacy.setup_payload.zip' <<< 25675 1727203981.35889: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.35901: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.35933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 25675 1727203981.35979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 25675 1727203981.35995: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 25675 1727203981.36141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ba70e0> import '_typing' # <<< 25675 1727203981.36292: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52b85fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52b85190> # zipimport: zlib available <<< 25675 1727203981.36408: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 25675 1727203981.36424: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 25675 
1727203981.37796: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.39105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ba4f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52bd6b10> <<< 25675 1727203981.39205: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bd68a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bd61b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bd6b70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ba7b00> import 'atexit' # <<< 25675 1727203981.39217: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.39240: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52bd7800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.39264: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52bd79b0> <<< 25675 1727203981.39626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bd7ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52525c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52527890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52528290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52529430> <<< 25675 1727203981.39655: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 25675 1727203981.39679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 25675 1727203981.39698: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 25675 1727203981.39884: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5252bef0> <<< 25675 1727203981.39888: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.39890: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52530230> <<< 25675 1727203981.39893: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5252a1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 25675 1727203981.39895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 25675 1727203981.39897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 25675 1727203981.39917: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 25675 1727203981.40009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 25675 1727203981.40147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52533e90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52532960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa525326c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 25675 1727203981.40225: stdout chunk (state=3): >>>import 
'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52532c30> <<< 25675 1727203981.40244: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5252a6c0> <<< 25675 1727203981.40282: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52577fe0> <<< 25675 1727203981.40364: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa525781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 25675 1727203981.40392: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.40411: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52579be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa525799a0> <<< 25675 1727203981.40421: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 25675 1727203981.40449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 25675 1727203981.40596: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5257c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5257a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 25675 1727203981.40602: stdout chunk (state=3): >>>import '_string' # <<< 25675 1727203981.40784: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5257f890> <<< 25675 1727203981.40787: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5257c260> <<< 25675 1727203981.40822: stdout chunk (state=3): >>># extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52580920> <<< 25675 1727203981.40851: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.40910: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa525806e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52580a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52578350> <<< 25675 1727203981.40930: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 25675 1727203981.40945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 25675 1727203981.40965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 25675 1727203981.41208: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.41212: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5240c0b0> <<< 25675 1727203981.41220: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.41226: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5240d520> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52582840> <<< 25675 1727203981.41229: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.41239: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52583bc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa525824b0> <<< 25675 1727203981.41253: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' 
# <<< 25675 1727203981.41492: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.41495: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.41498: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 25675 1727203981.41500: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.41630: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.41745: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.42286: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.42807: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 25675 1727203981.42892: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 25675 1727203981.42923: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52411550> <<< 25675 1727203981.43012: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 25675 1727203981.43016: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52412330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5240d640> <<< 25675 1727203981.43078: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 25675 1727203981.43182: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.43190: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 25675 1727203981.43262: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.43425: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 25675 1727203981.43459: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52412150> # zipimport: zlib available <<< 25675 1727203981.44057: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.44324: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.44550: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 25675 1727203981.44554: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.44557: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 25675 1727203981.44559: stdout chunk (state=3): >>># zipimport: zlib available <<< 
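The ansible.module_utils imports above are interleaved with "# zipimport: ..." messages because they are being satisfied from the temporary setup_payload archive reported earlier ("found 103 names in ...setup_payload.zip") through Python's standard zipimport path hook, rather than from .py files on disk. A minimal, self-contained sketch of that mechanism follows; the payload_demo module name is invented for illustration, and this is not the AnsiballZ wrapper itself:

    import os
    import sys
    import tempfile
    import zipfile

    # Build a tiny stand-in "payload" archive containing one module.
    workdir = tempfile.mkdtemp()
    zip_path = os.path.join(workdir, "payload.zip")
    with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("payload_demo.py", "GREETING = 'imported from a zip archive'\n")

    # Placing the archive on sys.path activates the zipimport path hook,
    # so the import below is served from inside the zip, not from a file on disk.
    sys.path.insert(0, zip_path)
    import payload_demo

    print(payload_demo.GREETING)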
25675 1727203981.44624: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.44739: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 25675 1727203981.44742: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.44744: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 25675 1727203981.44794: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.44880: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 25675 1727203981.44883: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.45057: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.45293: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 25675 1727203981.45391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 25675 1727203981.45406: stdout chunk (state=3): >>>import '_ast' # <<< 25675 1727203981.45424: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524133b0> # zipimport: zlib available <<< 25675 1727203981.45527: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.45574: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 25675 1727203981.45731: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203981.45790: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.45852: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.45901: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 25675 1727203981.45956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203981.46038: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5241de50> <<< 25675 1727203981.46083: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52418da0> <<< 25675 1727203981.46108: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 25675 1727203981.46168: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.46231: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.46255: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.46324: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203981.46395: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 25675 1727203981.46444: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 25675 1727203981.46473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 25675 1727203981.46486: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 25675 1727203981.46511: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52506870> <<< 25675 1727203981.46585: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bf6540> <<< 25675 1727203981.46716: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5241e060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52581130> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 25675 1727203981.46742: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 25675 1727203981.46766: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 25675 1727203981.46794: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 25675 1727203981.46853: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.47119: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 25675 1727203981.47178: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.47250: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.47268: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.47304: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 25675 1727203981.47496: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.47654: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.47699: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.47762: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203981.48094: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b1f10> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 25675 1727203981.48135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5208bdd0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52090170> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5249a750> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b2ab0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b0620> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b1010> <<< 25675 1727203981.48144: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 25675 1727203981.48193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 25675 1727203981.48227: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 25675 1727203981.48248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 25675 1727203981.48354: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52093110> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa520929c0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52092ba0> <<< 25675 1727203981.48367: stdout chunk (state=3): >>>import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faa52091df0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 25675 1727203981.48509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52093290> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 25675 1727203981.48572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa520fdd90> <<< 25675 1727203981.48617: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52093d70> <<< 25675 1727203981.48622: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b01d0> import 'ansible.module_utils.facts.timeout' # <<< 25675 1727203981.48690: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 25675 1727203981.48714: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.48790: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 25675 1727203981.48847: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.48897: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 25675 1727203981.49012: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 25675 1727203981.49065: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.49117: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 25675 1727203981.49134: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.49162: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.49215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 25675 1727203981.49242: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.49281: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.49343: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.49465: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.49469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 25675 1727203981.49483: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.49956: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.50507: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203981.50534: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.50578: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 25675 1727203981.50610: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.50633: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.50657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 25675 1727203981.50727: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.50774: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 25675 1727203981.50921: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.51057: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 25675 1727203981.51097: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 25675 1727203981.51135: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa520fda90> <<< 25675 1727203981.51163: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 25675 1727203981.51304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 25675 1727203981.51331: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa520feae0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 25675 1727203981.51391: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.51452: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 25675 1727203981.51549: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.51638: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 25675 1727203981.51715: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.51833: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 25675 1727203981.51836: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.51838: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.51930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 25675 1727203981.51934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 25675 1727203981.52002: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.52058: stdout chunk (state=3): >>># extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5212e1e0> <<< 25675 1727203981.52264: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5211f0b0> import 'ansible.module_utils.facts.system.python' # <<< 25675 1727203981.52268: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.52320: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.52373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 25675 1727203981.52484: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.52546: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.52657: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.52817: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 25675 1727203981.52861: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.52893: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 25675 1727203981.52914: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.52944: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.52996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 25675 1727203981.53315: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.53318: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203981.53329: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52145f40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5211f980> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 25675 1727203981.53378: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.53625: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203981.53715: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.53749: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.53829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 25675 1727203981.53832: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.53852: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.53865: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.53996: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.54212: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 25675 1727203981.54406: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 25675 1727203981.54453: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.54489: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.55055: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.55587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 25675 1727203981.55613: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.55730: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.55985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 25675 1727203981.55988: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.55993: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.56063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 25675 1727203981.56166: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.56342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 25675 1727203981.56351: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 25675 1727203981.56402: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.56409: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.56512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 25675 1727203981.56561: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.56656: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.56862: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 25675 1727203981.57091: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57120: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57167: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 25675 1727203981.57177: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57223: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57226: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 25675 1727203981.57238: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57299: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57377: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 25675 1727203981.57393: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57431: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57447: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 25675 1727203981.57499: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 25675 1727203981.57566: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 25675 1727203981.57630: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 25675 1727203981.57713: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.57965: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58233: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 25675 1727203981.58296: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58381: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 25675 1727203981.58384: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58398: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58442: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 25675 1727203981.58469: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58508: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 25675 1727203981.58529: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58553: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58599: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 25675 1727203981.58614: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.58707: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.59019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 25675 1727203981.59027: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203981.59044: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.59123: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.59183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 25675 1727203981.59251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203981.59306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 25675 1727203981.59348: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.59510: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.59695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 25675 1727203981.59718: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.59771: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.59818: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 25675 1727203981.59862: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 
1727203981.59906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 25675 1727203981.59948: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.60006: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.60101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 25675 1727203981.60187: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.60287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 25675 1727203981.60365: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203981.60560: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 25675 1727203981.60586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 25675 1727203981.60634: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa51f42870> <<< 25675 1727203981.60665: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f42de0> <<< 25675 1727203981.60691: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f404a0> <<< 25675 1727203981.72555: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 25675 1727203981.72588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f8ad20> <<< 25675 1727203981.72626: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 25675 1727203981.72649: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f89040> <<< 25675 1727203981.72703: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 25675 1727203981.72708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203981.72753: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f8a540> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f89f40> <<< 25675 1727203981.73021: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 25675 1727203981.97018: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2937, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 594, "free": 2937}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 567, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785718784, "block_size": 4096, "block_total": 65519099, "block_available": 63912529, "block_used": 1606570, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "01", "epoch": "1727203981", "epoch_int": "1727203981", "date": "2024-09-24", "time": "14:53:01", "iso8601_micro": "2024-09-24T18:53:01.966461Z", "iso8601": "2024-09-24T18:53:01Z", "iso8601_basic": "20240924T145301966461", "iso8601_basic_short": "20240924T145301", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.41015625, "5m": 0.40673828125, "15m": 0.2158203125}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727203981.97637: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 25675 1727203981.97722: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] 
removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset <<< 25675 1727203981.97805: stdout chunk (state=3): >>># cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp <<< 25675 1727203981.97829: stdout chunk (state=3): >>># cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common<<< 25675 1727203981.97860: stdout chunk (state=3): >>> # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 25675 1727203981.97922: stdout chunk (state=3): >>># cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale <<< 25675 1727203981.97926: stdout chunk (state=3): >>># cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext <<< 25675 1727203981.98037: stdout chunk (state=3): >>># destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix <<< 25675 1727203981.98040: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl <<< 25675 1727203981.98051: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd<<< 25675 1727203981.98084: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 25675 1727203981.98422: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 25675 1727203981.98426: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 25675 1727203981.98465: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 25675 1727203981.98497: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 25675 1727203981.98549: stdout chunk (state=3): >>># destroy ntpath <<< 25675 1727203981.98572: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 25675 1727203981.98610: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 25675 1727203981.98624: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 25675 1727203981.98669: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 25675 1727203981.98686: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 25675 1727203981.98748: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 25675 1727203981.98774: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 25675 1727203981.98809: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl <<< 25675 1727203981.98839: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 25675 1727203981.98902: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 25675 1727203981.98908: stdout chunk (state=3): >>># destroy json <<< 25675 1727203981.98942: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy 
multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 25675 1727203981.98987: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 25675 1727203981.99028: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 25675 1727203981.99084: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 25675 1727203981.99139: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 25675 1727203981.99159: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 25675 1727203981.99217: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 25675 1727203981.99230: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 25675 1727203981.99379: stdout chunk (state=3): >>># destroy sys.monitoring <<< 25675 1727203981.99397: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 25675 1727203981.99454: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 25675 1727203981.99460: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # 
destroy copyreg # destroy contextlib <<< 25675 1727203981.99499: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 25675 1727203981.99526: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 25675 1727203981.99550: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 25675 1727203981.99645: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 25675 1727203981.99683: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 25675 1727203981.99741: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 25675 1727203981.99771: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 25675 1727203981.99774: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 25675 1727203982.00184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
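The single long line embedded in the stdout chunks above is the setup module's JSON result, {"ansible_facts": {...}, "invocation": {...}}, printed for managed-node2 just before the wrapper tears the interpreter down; everything around it (# zipimport, # destroy, cleanup[N]) is PYTHONVERBOSE=1 import/teardown noise on the same stream. Below is a minimal sketch of how such a result line can be picked out of captured stdout and inspected with the standard library alone; sample_stdout, extract_result and the selected fact keys are illustrative stand-ins for this write-up, not code taken from the run itself.

import json

# Illustrative stand-in for the module's captured stdout: the real payload is
# the one line above that starts with {"ansible_facts": ...}, surrounded by
# verbose-import noise.
sample_stdout = (
    "# zipimport: zlib available\n"
    '{"ansible_facts": {"ansible_distribution": "CentOS", '
    '"ansible_distribution_major_version": "10", '
    '"ansible_default_ipv4": {"address": "10.31.13.254", "interface": "eth0"}}, '
    '"invocation": {"module_args": {"gather_subset": ["all"]}}}\n'
    "# destroy sys.monitoring\n"
)

def extract_result(stdout: str) -> dict:
    """Return the first line of stdout that parses as a JSON object,
    skipping the import/teardown chatter mixed into the same stream."""
    for line in stdout.splitlines():
        line = line.strip()
        if not line.startswith("{"):
            continue
        try:
            return json.loads(line)
        except json.JSONDecodeError:
            continue
    raise ValueError("no JSON result found in module stdout")

result = extract_result(sample_stdout)
facts = result["ansible_facts"]
print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])
print(facts["ansible_default_ipv4"]["address"])

Run against the real chunk above, the same lookups would yield "CentOS", "10" and "10.31.13.254".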
<<< 25675 1727203982.00188: stdout chunk (state=3): >>><<< 25675 1727203982.00198: stderr chunk (state=3): >>><<< 25675 1727203982.00500: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa530dc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa530abb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa530dea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52e91130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52e92060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecfe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f078c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f07f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ee7b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ee5280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecd040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f2b800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f2a420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ee6150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ece000> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecc2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f5cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f5cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ecade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5d670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5d340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5e570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f747a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f75e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7faa52f76d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f77350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f76270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52f77d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f774a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5e5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c67c80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c90710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c90470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c90740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c91070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52c91a30> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faa52c90920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c65e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c92e40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c91b80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52f5ecc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52cbb1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ce3590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52d40380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52d42ab0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52d40470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52d093a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52b413d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ce2390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52c93da0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7faa52b41640> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_guq_f69m/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ba70e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52b85fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52b85190> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ba4f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52bd6b10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bd68a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bd61b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bd6b70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52ba7b00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52bd7800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52bd79b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bd7ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52525c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52527890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52528290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52529430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5252bef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52530230> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5252a1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52533e90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52532960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7faa525326c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52532c30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5252a6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52577fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa525781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52579be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa525799a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5257c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5257a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5257f890> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5257c260> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52580920> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa525806e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52580a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52578350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5240c0b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5240d520> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52582840> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52583bc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa525824b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52411550> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52412330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5240d640> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52412150> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524133b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5241de50> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52418da0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52506870> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52bf6540> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5241e060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52581130> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b1f10> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5208bdd0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52090170> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5249a750> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b2ab0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b0620> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b1010> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52093110> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa520929c0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52092ba0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52091df0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52093290> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa520fdd90> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa52093d70> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa524b01d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa520fda90> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa520feae0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa5212e1e0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5211f0b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa52145f40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa5211f980> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faa51f42870> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f42de0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f404a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f8ad20> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f89040> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f8a540> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faa51f89f40> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2937, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 594, "free": 2937}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 567, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785718784, "block_size": 4096, "block_total": 65519099, "block_available": 63912529, "block_used": 1606570, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_service_mgr": "systemd", 
"ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "01", "epoch": "1727203981", "epoch_int": "1727203981", "date": "2024-09-24", "time": "14:53:01", "iso8601_micro": "2024-09-24T18:53:01.966461Z", "iso8601": "2024-09-24T18:53:01Z", "iso8601_basic": "20240924T145301966461", "iso8601_basic_short": "20240924T145301", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.41015625, "5m": 0.40673828125, 
"15m": 0.2158203125}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] 
removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # 
cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # 
destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ [...] # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
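The interpreter-discovery warning above is advisory: ansible-core probed the host and found /usr/bin/python3.12, and that choice could change if another Python is installed later. A minimal sketch of pinning the interpreter instead, assuming a host_vars file for managed-node2 (the file location is illustrative, not part of this run):

# host_vars/managed-node2.yml (illustrative location)
# Pinning the interpreter bypasses discovery and silences the warning above.
ansible_python_interpreter: /usr/bin/python3.12

The same value can also be set inventory-wide, or through the interpreter_python setting in ansible.cfg; either way, later runs no longer depend on discovery.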
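The ansible_facts payload recorded earlier in this result (ansible_interfaces, ansible_default_ipv4, ansible_dns, and so on) is what later tasks consume. A minimal sketch of a task reading those facts; the task is illustrative and is not taken from the test playbook being run:

# Illustrative task; assumes facts were gathered as in the output above.
- name: Show the default IPv4 route facts
  ansible.builtin.debug:
    msg: >-
      {{ ansible_default_ipv4.interface }} has
      {{ ansible_default_ipv4.address }}/{{ ansible_default_ipv4.prefix }}
      via {{ ansible_default_ipv4.gateway }}

The bare ansible_default_ipv4 variable exists because inject_facts_as_vars defaults to true; the same data is always available as ansible_facts.default_ipv4 regardless of that setting.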
25675 1727203982.02730: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203982.02750: _low_level_execute_command(): starting 25675 1727203982.02780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203980.606273-25692-96108671387770/ > /dev/null 2>&1 && sleep 0' 25675 1727203982.03448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203982.03463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203982.03477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203982.03508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203982.03615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203982.03632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203982.03655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203982.03852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203982.05781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203982.05797: stdout chunk (state=3): >>><<< 25675 1727203982.05810: stderr chunk (state=3): >>><<< 25675 1727203982.05834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203982.05848: handler run complete 25675 1727203982.05987: variable 'ansible_facts' from source: unknown 25675 1727203982.06101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.06434: variable 'ansible_facts' from source: unknown 25675 1727203982.06581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.06629: attempt loop complete, returning result 25675 1727203982.06638: _execute() done 25675 1727203982.06644: dumping result to json 25675 1727203982.06676: done dumping result, returning 25675 1727203982.06691: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-00000000007c] 25675 1727203982.06700: sending task result for task 028d2410-947f-41bd-b19d-00000000007c 25675 1727203982.07478: done sending task result for task 028d2410-947f-41bd-b19d-00000000007c 25675 1727203982.07482: WORKER PROCESS EXITING ok: [managed-node2] 25675 1727203982.07815: no more pending results, returning what we have 25675 1727203982.07819: results queue empty 25675 1727203982.07820: checking for any_errors_fatal 25675 1727203982.07821: done checking for any_errors_fatal 25675 1727203982.07822: checking for max_fail_percentage 25675 1727203982.07823: done checking for max_fail_percentage 25675 1727203982.07824: checking to see if all hosts have failed and the running result is not ok 25675 1727203982.07825: done checking to see if all hosts have failed 25675 1727203982.07825: getting the remaining hosts for this loop 25675 1727203982.07827: done getting the remaining hosts for this loop 25675 1727203982.07830: getting the next task for host managed-node2 25675 1727203982.07835: done getting next task for host managed-node2 25675 1727203982.07837: ^ task is: TASK: meta (flush_handlers) 25675 1727203982.07839: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203982.07842: getting variables 25675 1727203982.07844: in VariableManager get_vars() 25675 1727203982.07865: Calling all_inventory to load vars for managed-node2 25675 1727203982.07870: Calling groups_inventory to load vars for managed-node2 25675 1727203982.07873: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203982.07885: Calling all_plugins_play to load vars for managed-node2 25675 1727203982.07888: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203982.07890: Calling groups_plugins_play to load vars for managed-node2 25675 1727203982.08102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.08325: done with get_vars() 25675 1727203982.08346: done getting variables 25675 1727203982.08417: in VariableManager get_vars() 25675 1727203982.08428: Calling all_inventory to load vars for managed-node2 25675 1727203982.08430: Calling groups_inventory to load vars for managed-node2 25675 1727203982.08432: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203982.08446: Calling all_plugins_play to load vars for managed-node2 25675 1727203982.08449: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203982.08453: Calling groups_plugins_play to load vars for managed-node2 25675 1727203982.08633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.08830: done with get_vars() 25675 1727203982.08847: done queuing things up, now waiting for results queue to drain 25675 1727203982.08849: results queue empty 25675 1727203982.08850: checking for any_errors_fatal 25675 1727203982.08853: done checking for any_errors_fatal 25675 1727203982.08854: checking for max_fail_percentage 25675 1727203982.08860: done checking for max_fail_percentage 25675 1727203982.08861: checking to see if all hosts have failed and the running result is not ok 25675 1727203982.08862: done checking to see if all hosts have failed 25675 1727203982.08863: getting the remaining hosts for this loop 25675 1727203982.08864: done getting the remaining hosts for this loop 25675 1727203982.08867: getting the next task for host managed-node2 25675 1727203982.08886: done getting next task for host managed-node2 25675 1727203982.08890: ^ task is: TASK: Include the task 'el_repo_setup.yml' 25675 1727203982.08891: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203982.08893: getting variables 25675 1727203982.08894: in VariableManager get_vars() 25675 1727203982.08904: Calling all_inventory to load vars for managed-node2 25675 1727203982.08906: Calling groups_inventory to load vars for managed-node2 25675 1727203982.08908: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203982.08914: Calling all_plugins_play to load vars for managed-node2 25675 1727203982.08915: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203982.08918: Calling groups_plugins_play to load vars for managed-node2 25675 1727203982.09088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.09286: done with get_vars() 25675 1727203982.09294: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Tuesday 24 September 2024 14:53:02 -0400 (0:00:01.534) 0:00:01.544 ***** 25675 1727203982.09379: entering _queue_task() for managed-node2/include_tasks 25675 1727203982.09381: Creating lock for include_tasks 25675 1727203982.09740: worker is 1 (out of 1 available) 25675 1727203982.09990: exiting _queue_task() for managed-node2/include_tasks 25675 1727203982.10001: done queuing things up, now waiting for results queue to drain 25675 1727203982.10002: waiting for pending results... 25675 1727203982.10031: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 25675 1727203982.10208: in run() - task 028d2410-947f-41bd-b19d-000000000006 25675 1727203982.10212: variable 'ansible_search_path' from source: unknown 25675 1727203982.10215: calling self._execute() 25675 1727203982.10312: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203982.10327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203982.10349: variable 'omit' from source: magic vars 25675 1727203982.10478: _execute() done 25675 1727203982.10487: dumping result to json 25675 1727203982.10532: done dumping result, returning 25675 1727203982.10535: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-41bd-b19d-000000000006] 25675 1727203982.10538: sending task result for task 028d2410-947f-41bd-b19d-000000000006 25675 1727203982.10806: no more pending results, returning what we have 25675 1727203982.10812: in VariableManager get_vars() 25675 1727203982.10858: Calling all_inventory to load vars for managed-node2 25675 1727203982.10861: Calling groups_inventory to load vars for managed-node2 25675 1727203982.10865: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203982.10874: done sending task result for task 028d2410-947f-41bd-b19d-000000000006 25675 1727203982.10880: WORKER PROCESS EXITING 25675 1727203982.10894: Calling all_plugins_play to load vars for managed-node2 25675 1727203982.10898: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203982.10901: Calling groups_plugins_play to load vars for managed-node2 25675 1727203982.11308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.11544: done with get_vars() 25675 1727203982.11553: variable 'ansible_search_path' from source: unknown 25675 1727203982.11571: we have included files to process 25675 1727203982.11572: 
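[editorial note] The trace above reaches TASK [Include the task 'el_repo_setup.yml'] at tests_ethernet_nm.yml:11 and queues an include_tasks action for managed-node2. For orientation, an include of that shape is typically written as in the sketch below; this is a hedged illustration, not the verbatim contents of tests_ethernet_nm.yml, and the relative path is an assumption (the log later resolves the file under tests/network/tasks/).

```yaml
# Illustrative sketch only -- the real tests_ethernet_nm.yml is not reproduced in this log,
# and the relative path to el_repo_setup.yml is an assumption.
- name: Include the task 'el_repo_setup.yml'
  ansible.builtin.include_tasks: tasks/el_repo_setup.yml
```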
generating all_blocks data 25675 1727203982.11573: done generating all_blocks data 25675 1727203982.11574: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25675 1727203982.11577: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25675 1727203982.11580: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25675 1727203982.12179: in VariableManager get_vars() 25675 1727203982.12192: done with get_vars() 25675 1727203982.12200: done processing included file 25675 1727203982.12201: iterating over new_blocks loaded from include file 25675 1727203982.12203: in VariableManager get_vars() 25675 1727203982.12208: done with get_vars() 25675 1727203982.12209: filtering new block on tags 25675 1727203982.12218: done filtering new block on tags 25675 1727203982.12220: in VariableManager get_vars() 25675 1727203982.12226: done with get_vars() 25675 1727203982.12227: filtering new block on tags 25675 1727203982.12236: done filtering new block on tags 25675 1727203982.12237: in VariableManager get_vars() 25675 1727203982.12243: done with get_vars() 25675 1727203982.12244: filtering new block on tags 25675 1727203982.12252: done filtering new block on tags 25675 1727203982.12253: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 25675 1727203982.12257: extending task lists for all hosts with included blocks 25675 1727203982.12294: done extending task lists 25675 1727203982.12295: done processing included files 25675 1727203982.12295: results queue empty 25675 1727203982.12296: checking for any_errors_fatal 25675 1727203982.12297: done checking for any_errors_fatal 25675 1727203982.12297: checking for max_fail_percentage 25675 1727203982.12298: done checking for max_fail_percentage 25675 1727203982.12298: checking to see if all hosts have failed and the running result is not ok 25675 1727203982.12299: done checking to see if all hosts have failed 25675 1727203982.12299: getting the remaining hosts for this loop 25675 1727203982.12300: done getting the remaining hosts for this loop 25675 1727203982.12301: getting the next task for host managed-node2 25675 1727203982.12304: done getting next task for host managed-node2 25675 1727203982.12305: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 25675 1727203982.12307: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203982.12308: getting variables 25675 1727203982.12309: in VariableManager get_vars() 25675 1727203982.12316: Calling all_inventory to load vars for managed-node2 25675 1727203982.12317: Calling groups_inventory to load vars for managed-node2 25675 1727203982.12318: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203982.12323: Calling all_plugins_play to load vars for managed-node2 25675 1727203982.12324: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203982.12326: Calling groups_plugins_play to load vars for managed-node2 25675 1727203982.12430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.12538: done with get_vars() 25675 1727203982.12544: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.032) 0:00:01.577 ***** 25675 1727203982.12597: entering _queue_task() for managed-node2/setup 25675 1727203982.12837: worker is 1 (out of 1 available) 25675 1727203982.12849: exiting _queue_task() for managed-node2/setup 25675 1727203982.12860: done queuing things up, now waiting for results queue to drain 25675 1727203982.12861: waiting for pending results... 25675 1727203982.13010: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 25675 1727203982.13077: in run() - task 028d2410-947f-41bd-b19d-00000000008d 25675 1727203982.13088: variable 'ansible_search_path' from source: unknown 25675 1727203982.13091: variable 'ansible_search_path' from source: unknown 25675 1727203982.13122: calling self._execute() 25675 1727203982.13175: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203982.13181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203982.13188: variable 'omit' from source: magic vars 25675 1727203982.13572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203982.15315: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203982.15361: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203982.15396: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203982.15428: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203982.15448: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203982.15514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203982.15535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203982.15552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25675 1727203982.15579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203982.15591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203982.15720: variable 'ansible_facts' from source: unknown 25675 1727203982.15763: variable 'network_test_required_facts' from source: task vars 25675 1727203982.15796: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 25675 1727203982.15801: variable 'omit' from source: magic vars 25675 1727203982.15833: variable 'omit' from source: magic vars 25675 1727203982.15857: variable 'omit' from source: magic vars 25675 1727203982.15880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203982.15901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203982.15916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203982.15935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203982.15938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203982.15962: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203982.15965: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203982.15967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203982.16033: Set connection var ansible_shell_type to sh 25675 1727203982.16038: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203982.16043: Set connection var ansible_timeout to 10 25675 1727203982.16053: Set connection var ansible_pipelining to False 25675 1727203982.16059: Set connection var ansible_shell_executable to /bin/sh 25675 1727203982.16062: Set connection var ansible_connection to ssh 25675 1727203982.16088: variable 'ansible_shell_executable' from source: unknown 25675 1727203982.16092: variable 'ansible_connection' from source: unknown 25675 1727203982.16094: variable 'ansible_module_compression' from source: unknown 25675 1727203982.16097: variable 'ansible_shell_type' from source: unknown 25675 1727203982.16099: variable 'ansible_shell_executable' from source: unknown 25675 1727203982.16101: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203982.16103: variable 'ansible_pipelining' from source: unknown 25675 1727203982.16105: variable 'ansible_timeout' from source: unknown 25675 1727203982.16110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203982.16222: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727203982.16229: variable 'omit' from source: magic vars 25675 1727203982.16234: starting attempt loop 25675 
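[editorial note] At this point the setup action for "Gather the minimum subset of ansible_facts required by the network role test" (el_repo_setup.yml:3) is prepared: the conditional `not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts` evaluates to True, connection variables are resolved for the ssh plugin, and the 'normal' action plugin is loaded. A minimal fact-gathering task of that shape could look like the sketch below; only the task name and the when-condition are taken from the trace, the gather_subset/filter parameters are illustrative assumptions.

```yaml
# Hedged sketch of a minimal fact-gathering task; only the name and the when-condition
# appear in the trace above, gather_subset/filter are illustrative assumptions.
- name: Gather the minimum subset of ansible_facts required by the network role test
  ansible.builtin.setup:
    gather_subset: min
    filter: "{{ network_test_required_facts }}"
  when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
```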
1727203982.16237: running the handler 25675 1727203982.16248: _low_level_execute_command(): starting 25675 1727203982.16255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203982.16945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203982.17006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203982.18633: stdout chunk (state=3): >>>/root <<< 25675 1727203982.18794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203982.18798: stdout chunk (state=3): >>><<< 25675 1727203982.18801: stderr chunk (state=3): >>><<< 25675 1727203982.18823: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203982.18861: _low_level_execute_command(): starting 25675 1727203982.18900: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587 `" && echo ansible-tmp-1727203982.188381-25831-227296279139587="` echo /root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587 `" ) && sleep 0' 25675 1727203982.19552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203982.19605: stderr chunk (state=3): >>>debug1: 
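[editorial note] The two _low_level_execute_command() calls above first resolve the remote home directory (`echo ~` returns /root) and then create a per-task working directory with `umask 77 && mkdir -p ~/.ansible/tmp && mkdir ansible-tmp-...`. That location is governed by the shell plugin's remote_tmp option; a hedged sketch of pinning it explicitly as a host/group var is shown below (this run appears to be using the default, so nothing needs to be set to get the behaviour seen here).

```yaml
# Hypothetical var shown for illustration; the trace above is consistent with the
# default remote_tmp of ~/.ansible/tmp, so this setting is not required for this run.
ansible_remote_tmp: "~/.ansible/tmp"
```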
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203982.19624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203982.19722: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203982.19752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203982.19863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203982.21820: stdout chunk (state=3): >>>ansible-tmp-1727203982.188381-25831-227296279139587=/root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587 <<< 25675 1727203982.21994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203982.21999: stdout chunk (state=3): >>><<< 25675 1727203982.22001: stderr chunk (state=3): >>><<< 25675 1727203982.22509: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203982.188381-25831-227296279139587=/root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203982.22512: variable 'ansible_module_compression' from source: unknown 25675 1727203982.22515: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727203982.22559: variable 'ansible_facts' from source: unknown 25675 1727203982.23081: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/AnsiballZ_setup.py 25675 1727203982.23424: Sending 
initial data 25675 1727203982.23435: Sent initial data (153 bytes) 25675 1727203982.24424: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203982.24494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203982.24546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203982.24564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203982.24590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203982.24711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203982.26354: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203982.26528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
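[editorial note] Two connection details are worth noting in the transfer above: pipelining is off ("Set connection var ansible_pipelining to False" earlier in the trace), which is why module delivery happens as discrete mkdir / SFTP put / chmod / execute round trips, and every round trip reuses the same OpenSSH master ("auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305'"). The sketch below shows vars consistent with that behaviour; ansible_pipelining is taken from the trace, while the ControlMaster/ControlPersist options are assumptions rather than values read from this run's configuration.

```yaml
# Hedged sketch: ansible_pipelining False is taken from the trace; the ssh_common_args
# shown here are an assumption that would produce the "auto-mux" master reuse seen above.
ansible_pipelining: false
ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"
```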
<<< 25675 1727203982.26616: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp84kmcexf /root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/AnsiballZ_setup.py <<< 25675 1727203982.26619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/AnsiballZ_setup.py" <<< 25675 1727203982.26674: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp84kmcexf" to remote "/root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/AnsiballZ_setup.py" <<< 25675 1727203982.29760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203982.29853: stderr chunk (state=3): >>><<< 25675 1727203982.29863: stdout chunk (state=3): >>><<< 25675 1727203982.29981: done transferring module to remote 25675 1727203982.29985: _low_level_execute_command(): starting 25675 1727203982.29987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/ /root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/AnsiballZ_setup.py && sleep 0' 25675 1727203982.30537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203982.30552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203982.30583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203982.30603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203982.30620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203982.30719: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203982.30740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203982.30883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203982.32959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203982.33002: stderr chunk (state=3): >>><<< 25675 1727203982.33013: stdout chunk (state=3): >>><<< 25675 1727203982.33040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203982.33132: _low_level_execute_command(): starting 25675 1727203982.33135: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/AnsiballZ_setup.py && sleep 0' 25675 1727203982.33697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203982.33713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203982.33727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203982.33744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203982.33759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203982.33772: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727203982.33795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203982.33814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203982.33826: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727203982.33837: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727203982.33890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203982.33938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203982.33956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203982.33982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203982.34098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203982.37551: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 25675 1727203982.37555: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 25675 1727203982.37605: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203982.37642: stdout chunk (state=3): >>>import '_codecs' # <<< 25675 1727203982.37645: stdout chunk (state=3): >>>import 'codecs' # <<< 25675 1727203982.37714: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 25675 1727203982.37718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 25675 1727203982.37736: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60f684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60f37b30> <<< 25675 1727203982.37761: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60f6aa50> <<< 25675 1727203982.37790: stdout chunk (state=3): >>>import '_signal' # <<< 25675 1727203982.37903: stdout chunk (state=3): >>>import '_abc' # <<< 25675 1727203982.38013: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 25675 1727203982.38020: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # <<< 25675 1727203982.38140: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d3d130> <<< 25675 1727203982.38182: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 25675 1727203982.38207: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d3e060> <<< 25675 1727203982.38248: stdout chunk (state=3): >>>import 'site' # <<< 25675 1727203982.38268: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
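[editorial note] Everything from here on in the stdout chunks is the remote interpreter's PYTHONVERBOSE import trace: AnsiballZ_setup.py is executed with /usr/bin/python3.12 (the banner above identifies Python 3.12.5) and proceeds to zipimport the bundled setup module. Pinning that interpreter explicitly would look like the sketch below; in this run the path may equally come from interpreter discovery, so treat this as illustrative only.

```yaml
# Hypothetical host var; /usr/bin/python3.12 is the interpreter actually invoked above,
# whether set explicitly like this or selected by interpreter discovery.
ansible_python_interpreter: /usr/bin/python3.12
```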
<<< 25675 1727203982.38641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 25675 1727203982.38667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 25675 1727203982.38802: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d7bf50> <<< 25675 1727203982.38819: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 25675 1727203982.38841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 25675 1727203982.38865: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d900e0> <<< 25675 1727203982.38876: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 25675 1727203982.38908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 25675 1727203982.38932: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 25675 1727203982.38987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203982.39109: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60db3920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60db3fb0> import '_collections' # <<< 25675 1727203982.39137: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d93bc0> import '_functools' # <<< 25675 1727203982.39174: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d91340> <<< 25675 1727203982.39265: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d79100> <<< 25675 1727203982.39297: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 25675 1727203982.39651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60dd78f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60dd6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d921e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60dd4d10> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e04950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d78380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.39672: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e04e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e04cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e050a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d76ea0> <<< 25675 1727203982.39701: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203982.39758: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 25675 1727203982.39779: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e05790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e05460> import 'importlib.machinery' # <<< 25675 1727203982.39813: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 25675 1727203982.39856: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e06660> import 'importlib.util' # <<< 25675 1727203982.39877: 
stdout chunk (state=3): >>>import 'runpy' # <<< 25675 1727203982.39914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 25675 1727203982.39957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 25675 1727203982.39984: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e20890> <<< 25675 1727203982.40031: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.40037: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e21fd0> <<< 25675 1727203982.40052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 25675 1727203982.40106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 25675 1727203982.40117: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e22e70> <<< 25675 1727203982.40181: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e234a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e223c0> <<< 25675 1727203982.40203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 25675 1727203982.40264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e23e60> <<< 25675 1727203982.40282: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e23590> <<< 25675 1727203982.40334: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e066c0> <<< 25675 1727203982.40362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 25675 1727203982.40413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 25675 1727203982.40434: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 25675 1727203982.40485: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b17d40> <<< 25675 1727203982.40543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b44860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b445c0> <<< 25675 1727203982.40617: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b44890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 25675 1727203982.40714: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.40918: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b451c0> <<< 25675 1727203982.41100: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b45b80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b44a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b15ee0> <<< 25675 1727203982.41172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 25675 1727203982.41192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 25675 1727203982.41220: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b46f60> <<< 25675 1727203982.41223: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b45cd0> <<< 25675 1727203982.41251: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9f60e06db0> <<< 25675 1727203982.41285: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 25675 1727203982.41385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 25675 1727203982.41431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 25675 1727203982.41489: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b6b2c0> <<< 25675 1727203982.41538: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 25675 1727203982.41573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 25675 1727203982.41605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 25675 1727203982.41686: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b8f620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 25675 1727203982.41743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 25675 1727203982.41835: stdout chunk (state=3): >>>import 'ntpath' # <<< 25675 1727203982.41864: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60bf03b0> <<< 25675 1727203982.41928: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 25675 1727203982.41956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 25675 1727203982.42087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 25675 1727203982.42137: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60bf2ae0> <<< 25675 1727203982.42251: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60bf04a0> <<< 25675 1727203982.42291: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60bb9430> <<< 25675 1727203982.42371: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60525460> <<< 25675 1727203982.42392: stdout chunk (state=3): >>>import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b8e420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b47ec0> <<< 25675 1727203982.42546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 25675 1727203982.42578: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9f60b8e540> <<< 25675 1727203982.43082: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_g3ty8bck/ansible_setup_payload.zip' # zipimport: zlib available <<< 25675 1727203982.43238: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 25675 1727203982.43267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6058f110> import '_typing' # <<< 25675 1727203982.43442: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6056e000> <<< 25675 1727203982.43481: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6056d160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 25675 1727203982.43506: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25675 1727203982.43558: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 25675 1727203982.43657: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.45118: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.46960: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6058cfe0> <<< 25675 1727203982.46991: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203982.47093: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f605bea50> <<< 25675 1727203982.47136: stdout chunk 
(state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605be7e0> <<< 25675 1727203982.47186: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605be0f0> <<< 25675 1727203982.47203: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 25675 1727203982.47222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 25675 1727203982.47300: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605be870> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6058fda0> import 'atexit' # <<< 25675 1727203982.47324: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f605bf7d0> <<< 25675 1727203982.47364: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f605bfa10> <<< 25675 1727203982.47385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 25675 1727203982.47440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 25675 1727203982.47466: stdout chunk (state=3): >>>import '_locale' # <<< 25675 1727203982.47517: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605bff50> import 'pwd' # <<< 25675 1727203982.47528: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 25675 1727203982.47555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 25675 1727203982.47591: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60429d90> <<< 25675 1727203982.47630: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f6042b9b0> <<< 25675 1727203982.47655: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 25675 1727203982.47677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 25675 1727203982.47703: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042c380> <<< 25675 1727203982.47724: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 25675 1727203982.47756: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 25675 1727203982.47768: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042d280> <<< 25675 1727203982.47802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 25675 1727203982.47850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 25675 1727203982.47853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 25675 1727203982.47864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 25675 1727203982.47912: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042ff50> <<< 25675 1727203982.47958: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e22de0> <<< 25675 1727203982.47962: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042e210> <<< 25675 1727203982.47985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 25675 1727203982.48015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 25675 1727203982.48060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 25675 1727203982.48063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 25675 1727203982.48175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 25675 1727203982.48216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 25675 1727203982.48220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60437e30> <<< 25675 1727203982.48238: stdout chunk (state=3): >>>import '_tokenize' # <<< 25675 1727203982.48640: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60436900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60436660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60436bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042e720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed 
from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f6047bef0> <<< 25675 1727203982.48725: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6047c200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 25675 1727203982.48739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 25675 1727203982.48774: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f6047dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6047da60> <<< 25675 1727203982.48921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f604801d0> <<< 25675 1727203982.48930: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6047e360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 25675 1727203982.48969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203982.49126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60483980> <<< 25675 1727203982.49284: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60480380> <<< 25675 1727203982.49504: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60484740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension 
module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60484b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60484ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6047c3b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 25675 1727203982.49521: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 25675 1727203982.49564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 25675 1727203982.49600: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.49648: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f603101a0><<< 25675 1727203982.49656: stdout chunk (state=3): >>> <<< 25675 1727203982.49910: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.49925: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.49944: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60311730> <<< 25675 1727203982.49958: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60486930><<< 25675 1727203982.49973: stdout chunk (state=3): >>> <<< 25675 1727203982.50006: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.50027: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60487ce0> <<< 25675 1727203982.50146: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60486540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 25675 1727203982.50242: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.50377: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.50405: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.50419: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 25675 1727203982.50445: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 25675 1727203982.50465: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.50486: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 25675 1727203982.50511: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.50709: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.50905: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.51923: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.52772: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 25675 1727203982.52786: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 25675 1727203982.52797: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 25675 1727203982.52816: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 25675 1727203982.52896: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203982.52952: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.52964: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.52972: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f603158e0> <<< 25675 1727203982.53092: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 25675 1727203982.53201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60316660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60311580> import 'ansible.module_utils.compat.selinux' # <<< 25675 1727203982.53223: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.53256: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.53274: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 25675 1727203982.53300: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.53541: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.53814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 25675 1727203982.53827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60316630> <<< 25675 1727203982.53902: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.56256: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 25675 1727203982.56504: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.56706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 25675 1727203982.56822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 25675 1727203982.56917: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60317980> <<< 25675 1727203982.56928: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.57024: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.57131: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 25675 1727203982.57148: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 25675 1727203982.57319: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 25675 1727203982.57338: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.57393: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.57467: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.57567: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 25675 1727203982.57619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203982.57743: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.57753: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f603221e0> <<< 25675 1727203982.57800: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6031f2f0> <<< 25675 1727203982.57834: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 25675 1727203982.57842: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 25675 1727203982.57852: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.57938: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.58023: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.58062: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.58201: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 25675 1727203982.58267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 25675 1727203982.58286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 25675 1727203982.58314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 25675 1727203982.58395: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6040ab70> <<< 25675 1727203982.58454: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605ea840> <<< 25675 1727203982.58565: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60322360> <<< 25675 1727203982.58572: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60485010> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 25675 1727203982.58596: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.58798: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 25675 1727203982.58852: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.58933: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.58961: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.58982: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59039: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59093: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59141: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 25675 1727203982.59197: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59300: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59408: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59431: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59483: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 25675 1727203982.59491: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.59762: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.60024: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.60081: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.60148: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 25675 1727203982.60152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 25675 
1727203982.60304: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b25d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 25675 1727203982.60312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 25675 1727203982.60323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 25675 1727203982.60383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 25675 1727203982.60403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 25675 1727203982.60424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 25675 1727203982.60437: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ff48200> <<< 25675 1727203982.60490: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.60496: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ff48530> <<< 25675 1727203982.60562: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6039f3e0> <<< 25675 1727203982.60818: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b3140> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b0cb0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b1730> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ff4b560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f9f5ff4ae10> <<< 25675 1727203982.60822: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ff4aff0> <<< 25675 1727203982.60900: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ff4a240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 25675 1727203982.61039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 25675 1727203982.61057: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ff4b740> <<< 25675 1727203982.61078: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 25675 1727203982.61204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ffaa270> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ffa8290> <<< 25675 1727203982.61227: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b2390> import 'ansible.module_utils.facts.timeout' # <<< 25675 1727203982.61236: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 25675 1727203982.61262: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.61266: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.61286: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 25675 1727203982.61294: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.61379: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.61494: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 25675 1727203982.61536: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.61591: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 25675 1727203982.61615: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.61618: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.61635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 25675 1727203982.61796: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203982.61836: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 25675 1727203982.61841: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 
1727203982.61900: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.61958: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 25675 1727203982.61965: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.62039: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.62122: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.62199: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.62274: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 25675 1727203982.62288: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 25675 1727203982.62296: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.63064: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.63779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 25675 1727203982.63856: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.63927: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64010: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64015: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 25675 1727203982.64027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 25675 1727203982.64102: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # <<< 25675 1727203982.64108: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64178: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 25675 1727203982.64266: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64304: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64334: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 25675 1727203982.64357: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64393: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64498: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 25675 1727203982.64537: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.64658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 25675 1727203982.64664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 25675 1727203982.64698: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ffaa420> <<< 25675 1727203982.64718: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 25675 1727203982.64757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 25675 1727203982.64934: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ffab080> <<< 25675 1727203982.64940: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.local' # <<< 25675 1727203982.65095: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25675 1727203982.65125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 25675 1727203982.65131: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.65263: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.65386: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 25675 1727203982.65402: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.65485: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.65588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 25675 1727203982.65593: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.65650: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.65710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 25675 1727203982.65773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 25675 1727203982.65860: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.65950: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ffe6600> <<< 25675 1727203982.66305: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ffd6db0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 25675 1727203982.66332: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.66413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 25675 1727203982.66418: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.66539: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.66653: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.66896: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.67040: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 25675 1727203982.67045: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 25675 1727203982.67103: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.67160: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 25675 1727203982.67169: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.67298: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 25675 1727203982.67315: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.67346: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' 
import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5fffa240> <<< 25675 1727203982.67352: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5fffbbf0> <<< 25675 1727203982.67365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 25675 1727203982.67370: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.67391: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.67404: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 25675 1727203982.67408: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.67457: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.67596: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 25675 1727203982.67755: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.67979: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 25675 1727203982.67995: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.68132: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.68282: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.68332: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.68390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 25675 1727203982.68412: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.68594: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25675 1727203982.68660: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.68868: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 25675 1727203982.68873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 25675 1727203982.68887: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.69066: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.69245: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 25675 1727203982.69262: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.69298: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.69350: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.70216: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.71018: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 25675 1727203982.71026: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.71172: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.71325: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 25675 1727203982.71331: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.71481: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.71798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 25675 1727203982.71855: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.72085: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 25675 1727203982.72115: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.72125: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 25675 1727203982.72139: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.72193: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.72252: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 25675 1727203982.72394: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.72539: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.72845: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 25675 1727203982.73177: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73215: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73266: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 25675 1727203982.73272: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73305: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73496: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203982.73525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 25675 1727203982.73532: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73562: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 25675 1727203982.73606: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73680: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 25675 1727203982.73767: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73847: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.73917: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 25675 1727203982.73935: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.74337: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.74741: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 25675 1727203982.74747: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.74830: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.74906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 25675 1727203982.74919: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.75002: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.75009: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 25675 1727203982.75015: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.75058: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.75089: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 25675 1727203982.75198: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 25675 1727203982.75354: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.75460: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 25675 1727203982.75479: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.75490: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 25675 1727203982.75509: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.75565: stdout chunk (state=3): >>># zipimport: zlib available<<< 25675 1727203982.75584: stdout chunk (state=3): >>> <<< 25675 1727203982.75809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203982.75872: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.75999: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.76094: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 25675 1727203982.76139: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 25675 1727203982.76150: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.76220: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.76296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 25675 1727203982.76320: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.76644: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.76952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 25675 1727203982.76966: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.77042: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.77299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 25675 1727203982.77402: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.77544: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 25675 1727203982.77547: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 25675 1727203982.77577: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.77698: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.77828: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 25675 1727203982.77833: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # <<< 25675 1727203982.77851: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 25675 1727203982.77973: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203982.79417: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 25675 1727203982.79438: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 25675 1727203982.79443: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 25675 1727203982.79445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 25675 1727203982.79503: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.79507: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203982.79527: stdout chunk (state=3): >>>import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5fdf7860> <<< 25675 1727203982.79552: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5fdf4140> <<< 25675 1727203982.79852: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5fdf52b0> <<< 25675 1727203982.80015: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "02", "epoch": "1727203982", "epoch_int": "1727203982", "date": "2024-09-24", "time": "14:53:02", "iso8601_micro": "2024-09-24T18:53:02.782235Z", "iso8601": "2024-09-24T18:53:02Z", "iso8601_basic": "20240924T145302782235", "iso8601_basic_short": "20240924T145302", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": 
"enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727203982.80754: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] 
removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 25675 1727203982.80759: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy <<< 25675 1727203982.80819: stdout chunk (state=3): >>># cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing 
locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing 
distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai <<< 25675 1727203982.80838: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl <<< 25675 1727203982.80895: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing 
ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd <<< 25675 1727203982.80971: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 25675 1727203982.81483: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 25675 1727203982.81546: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # 
destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 25675 1727203982.81581: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 25675 1727203982.81662: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 25675 1727203982.81665: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 25675 1727203982.81696: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct <<< 25675 1727203982.81744: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 25675 1727203982.81754: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser <<< 25675 1727203982.81798: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 25675 1727203982.81826: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 25675 1727203982.81847: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 25675 1727203982.81890: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 25675 1727203982.81916: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # 
cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 25675 1727203982.82174: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 25675 1727203982.82187: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 25675 1727203982.82212: stdout chunk (state=3): >>># destroy _typing <<< 25675 1727203982.82245: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 25675 1727203982.82307: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 25675 1727203982.82397: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 25675 1727203982.82413: stdout chunk (state=3): >>># destroy time <<< 25675 1727203982.82435: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 25675 1727203982.82478: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 25675 1727203982.82492: stdout chunk (state=3): >>># destroy itertools <<< 25675 1727203982.82527: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 25675 1727203982.82554: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 25675 1727203982.82704: stdout chunk (state=3): >>> <<< 25675 1727203982.83178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
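The long run of "# cleanup[2] removing ...", "# destroy ..." and "# cleanup[3] wiping ..." messages streamed in the stdout chunks above is CPython's interpreter-shutdown trace: the same import tracing that printed the "import 'foo' # ..." lines while the setup module loaded also reports each module being torn down as the remote Python 3.12 process exits. This excerpt does not show how verbose import tracing was enabled on the managed node, but the format matches what CPython emits under "python -v" (or PYTHONVERBOSE=1). The trailing stderr chunk ("debug2: Received exit status from master 0" / "Shared connection to 10.31.13.254 closed.") is OpenSSH tearing down the multiplexed connection. A minimal local sketch, assuming only the standard library, that reproduces a comparable trace (the "-c" payload is arbitrary; CPython writes the -v trace to stderr, whereas the wrapper above surfaces it through stdout chunks):

import subprocess
import sys

# Run a trivial script under "python -v" and keep only the shutdown portion
# of the trace, i.e. the "# cleanup[...]"/"# destroy ..." lines seen above.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "import json, base64"],
    capture_output=True,
    text=True,
)
for line in proc.stderr.splitlines():
    if line.startswith(("# cleanup", "# destroy")):
        print(line)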
<<< 25675 1727203982.83182: stdout chunk (state=3): >>><<< 25675 1727203982.83185: stderr chunk (state=3): >>><<< 25675 1727203982.83371: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60f684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60f37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60f6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d3d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d3e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
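At this point the controller stops streaming chunks and logs the summary record "_low_level_execute_command() done: rc=0, stdout=...": the remote command exited with return code 0, and its full captured stdout is reproduced once more in one piece, beginning with interpreter start-up (encodings, site, the Python 3.12.5 banner above) before the module's own imports continue below. For working with logs like this one, a hypothetical helper (not part of Ansible; the file name in the usage comment is made up) could pull the per-command return codes out of those "done: rc=N" records:

import re
import sys

# Match the controller's summary record for a finished remote command.
DONE_RE = re.compile(r"_low_level_execute_command\(\) done: rc=(\d+)")

def exec_return_codes(path):
    """Yield every remote-command return code found in the log at *path*."""
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            match = DONE_RE.search(line)
            if match:
                yield int(match.group(1))

if __name__ == "__main__":
    # e.g. python rc_scan.py ansible-vvvv.log
    for rc in exec_return_codes(sys.argv[1]):
        print("remote command finished with rc =", rc)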
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d7bf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d900e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60db3920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60db3fb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d93bc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d91340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d79100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60dd78f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60dd6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d921e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60dd4d10> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e04950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d78380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e04e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e04cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e050a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60d76ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e05790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e05460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e06660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e20890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e21fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9f60e22e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e234a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e223c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e23e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e23590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e066c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b17d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b44860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b445c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b44890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b451c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60b45b80> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b44a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b15ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b46f60> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b45cd0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60e06db0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b6b2c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b8f620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60bf03b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60bf2ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60bf04a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60bb9430> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60525460> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b8e420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60b47ec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f9f60b8e540> # zipimport: found 103 names in '/tmp/ansible_setup_payload_g3ty8bck/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6058f110> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6056e000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6056d160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6058cfe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f605bea50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605be7e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605be0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605be870> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6058fda0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f605bf7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f605bfa10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605bff50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60429d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f6042b9b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042c380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042d280> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042ff50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60e22de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042e210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60437e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60436900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60436660> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60436bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6042e720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f6047bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6047c200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f6047dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6047da60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f604801d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6047e360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60483980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60480380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60484740> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60484b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60484ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6047c3b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f603101a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60311730> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60486930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f60487ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60486540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f603158e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60316660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60311580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60316630> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60317980> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f603221e0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6031f2f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6040ab70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f605ea840> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60322360> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f60485010> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b25d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ff48200> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ff48530> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f6039f3e0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b3140> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b0cb0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b1730> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ff4b560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ff4ae10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ff4aff0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ff4a240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ff4b740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ffaa270> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ffa8290> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f603b2390> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ffaa420> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ffab080> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5ffe6600> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5ffd6db0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5fffa240> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5fffbbf0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f5fdf7860> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5fdf4140> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f5fdf52b0> {"ansible_facts": {"ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "02", "epoch": "1727203982", "epoch_int": "1727203982", "date": "2024-09-24", "time": "14:53:02", "iso8601_micro": "2024-09-24T18:53:02.782235Z", "iso8601": "2024-09-24T18:53:02Z", "iso8601_basic": "20240924T145302782235", "iso8601_basic_short": "20240924T145302", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] 
removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] 
removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy 
ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] 
wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing 
ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing 
ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy 
pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # 
cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 25675 1727203982.84913: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203982.84916: _low_level_execute_command(): starting 25675 1727203982.84919: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203982.188381-25831-227296279139587/ > /dev/null 2>&1 && sleep 0' 25675 1727203982.85052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203982.85057: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203982.85286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203982.85324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203982.87843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203982.87848: stdout chunk (state=3): >>><<< 25675 1727203982.87851: stderr chunk (state=3): >>><<< 25675 1727203982.87883: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203982.87895: handler run complete 25675 1727203982.87940: variable 'ansible_facts' from source: unknown 25675 1727203982.88087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.88135: variable 'ansible_facts' from source: unknown 25675 1727203982.88507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.88571: attempt loop complete, returning result 25675 1727203982.88585: _execute() done 25675 1727203982.88592: dumping result to json 25675 1727203982.88607: done dumping result, returning 25675 1727203982.88619: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-41bd-b19d-00000000008d] 25675 1727203982.88629: sending task result for task 028d2410-947f-41bd-b19d-00000000008d 25675 1727203982.88969: done sending task result for task 028d2410-947f-41bd-b19d-00000000008d 25675 1727203982.88972: WORKER PROCESS EXITING ok: [managed-node2] 25675 1727203982.89199: no more pending results, returning what we have 25675 1727203982.89202: results queue empty 25675 1727203982.89203: checking for 
any_errors_fatal 25675 1727203982.89205: done checking for any_errors_fatal 25675 1727203982.89205: checking for max_fail_percentage 25675 1727203982.89207: done checking for max_fail_percentage 25675 1727203982.89208: checking to see if all hosts have failed and the running result is not ok 25675 1727203982.89209: done checking to see if all hosts have failed 25675 1727203982.89209: getting the remaining hosts for this loop 25675 1727203982.89211: done getting the remaining hosts for this loop 25675 1727203982.89215: getting the next task for host managed-node2 25675 1727203982.89224: done getting next task for host managed-node2 25675 1727203982.89227: ^ task is: TASK: Check if system is ostree 25675 1727203982.89230: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203982.89234: getting variables 25675 1727203982.89236: in VariableManager get_vars() 25675 1727203982.89265: Calling all_inventory to load vars for managed-node2 25675 1727203982.89268: Calling groups_inventory to load vars for managed-node2 25675 1727203982.89271: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203982.89589: Calling all_plugins_play to load vars for managed-node2 25675 1727203982.89593: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203982.89599: Calling groups_plugins_play to load vars for managed-node2 25675 1727203982.90034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203982.90501: done with get_vars() 25675 1727203982.90514: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.781) 0:00:02.358 ***** 25675 1727203982.90730: entering _queue_task() for managed-node2/stat 25675 1727203982.91730: worker is 1 (out of 1 available) 25675 1727203982.91743: exiting _queue_task() for managed-node2/stat 25675 1727203982.91869: done queuing things up, now waiting for results queue to drain 25675 1727203982.91872: waiting for pending results... 
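The task queued above ("Check if system is ostree", el_repo_setup.yml:17) is dispatched to the stat module for managed-node2, as the _queue_task() record shows. The task file itself is not reproduced in this log; a rough ad-hoc equivalent of such a check, assuming it stats the conventional /run/ostree-booted marker and using a placeholder inventory file name, would be:

    # Hypothetical ad-hoc reproduction of the ostree check; the /run/ostree-booted
    # path and the inventory.yml file name are assumptions, not taken from this log.
    ansible managed-node2 -i inventory.yml -m stat -a 'path=/run/ostree-booted' -vv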
25675 1727203982.92376: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 25675 1727203982.92621: in run() - task 028d2410-947f-41bd-b19d-00000000008f 25675 1727203982.92625: variable 'ansible_search_path' from source: unknown 25675 1727203982.92628: variable 'ansible_search_path' from source: unknown 25675 1727203982.92630: calling self._execute() 25675 1727203982.92853: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203982.92945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203982.92949: variable 'omit' from source: magic vars 25675 1727203982.94257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203982.95090: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203982.95172: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203982.95255: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203982.95305: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203982.95403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203982.95438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203982.95467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203982.95498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203982.95642: Evaluated conditional (not __network_is_ostree is defined): True 25675 1727203982.95652: variable 'omit' from source: magic vars 25675 1727203982.95741: variable 'omit' from source: magic vars 25675 1727203982.95744: variable 'omit' from source: magic vars 25675 1727203982.95762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203982.95794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203982.95821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203982.95854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203982.95874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203982.95912: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203982.95961: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203982.95964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203982.96042: Set connection var ansible_shell_type to sh 25675 1727203982.96053: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203982.96072: 
Set connection var ansible_timeout to 10 25675 1727203982.96086: Set connection var ansible_pipelining to False 25675 1727203982.96095: Set connection var ansible_shell_executable to /bin/sh 25675 1727203982.96180: Set connection var ansible_connection to ssh 25675 1727203982.96183: variable 'ansible_shell_executable' from source: unknown 25675 1727203982.96185: variable 'ansible_connection' from source: unknown 25675 1727203982.96187: variable 'ansible_module_compression' from source: unknown 25675 1727203982.96189: variable 'ansible_shell_type' from source: unknown 25675 1727203982.96191: variable 'ansible_shell_executable' from source: unknown 25675 1727203982.96192: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203982.96193: variable 'ansible_pipelining' from source: unknown 25675 1727203982.96195: variable 'ansible_timeout' from source: unknown 25675 1727203982.96197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203982.96320: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727203982.96609: variable 'omit' from source: magic vars 25675 1727203982.96612: starting attempt loop 25675 1727203982.96615: running the handler 25675 1727203982.96617: _low_level_execute_command(): starting 25675 1727203982.96619: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203982.98000: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203982.98099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203982.98284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203982.98577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203982.98796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203983.00527: stdout chunk (state=3): >>>/root <<< 25675 1727203983.00651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203983.00655: stdout chunk (state=3): >>><<< 25675 1727203983.00658: stderr chunk (state=3): >>><<< 25675 1727203983.00872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203983.00890: _low_level_execute_command(): starting 25675 1727203983.00894: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000 `" && echo ansible-tmp-1727203983.0070229-25862-29363630402000="` echo /root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000 `" ) && sleep 0' 25675 1727203983.01661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203983.01665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203983.01671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203983.01674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203983.01722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203983.01740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203983.01827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203983.03761: stdout chunk (state=3): >>>ansible-tmp-1727203983.0070229-25862-29363630402000=/root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000 <<< 25675 1727203983.03897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203983.03930: stderr chunk (state=3): >>><<< 25675 1727203983.03945: stdout chunk (state=3): >>><<< 25675 1727203983.03966: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203983.0070229-25862-29363630402000=/root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203983.04121: variable 'ansible_module_compression' from source: unknown 25675 1727203983.04126: ANSIBALLZ: Using lock for stat 25675 1727203983.04128: ANSIBALLZ: Acquiring lock 25675 1727203983.04129: ANSIBALLZ: Lock acquired: 139822507791600 25675 1727203983.04131: ANSIBALLZ: Creating module 25675 1727203983.20158: ANSIBALLZ: Writing module into payload 25675 1727203983.20232: ANSIBALLZ: Writing module 25675 1727203983.20251: ANSIBALLZ: Renaming module 25675 1727203983.20254: ANSIBALLZ: Done creating module 25675 1727203983.20273: variable 'ansible_facts' from source: unknown 25675 1727203983.20331: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/AnsiballZ_stat.py 25675 1727203983.20439: Sending initial data 25675 1727203983.20442: Sent initial data (152 bytes) 25675 1727203983.20925: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203983.20930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203983.20932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203983.20934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727203983.20936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203983.21030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203983.21034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203983.21106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
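The records around this point trace the full AnsiballZ delivery cycle for the stat task: a temporary directory is created on the remote host, AnsiballZ_stat.py is uploaded into it over the existing SSH ControlMaster via sftp, the wrapper is made executable and executed with the remote Python interpreter (with PYTHONVERBOSE=1, which produces the import trace that follows), and the directory is removed once the result has been read back, as it was for the earlier setup task. Condensed into the remote-side shell steps recorded in this log (the sftp upload is controller-driven, not a remote command, so it appears here only as a comment):

    # Remote temp dir for this task, exactly as named in the log records above.
    TMP=/root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000
    ( umask 77 && mkdir -p /root/.ansible/tmp && mkdir "$TMP" ) && sleep 0
    # AnsiballZ_stat.py is then uploaded into $TMP with an sftp "put" over the ControlMaster.
    chmod u+x "$TMP" "$TMP/AnsiballZ_stat.py" && sleep 0
    PYTHONVERBOSE=1 /usr/bin/python3.12 "$TMP/AnsiballZ_stat.py" && sleep 0
    rm -f -r "$TMP" > /dev/null 2>&1 && sleep 0   # cleanup once the result is collected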
25675 1727203983.23166: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203983.23236: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727203983.23319: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpd8a__dlu /root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/AnsiballZ_stat.py <<< 25675 1727203983.23323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/AnsiballZ_stat.py" <<< 25675 1727203983.23394: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 25675 1727203983.23397: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpd8a__dlu" to remote "/root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/AnsiballZ_stat.py" <<< 25675 1727203983.24096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203983.24136: stderr chunk (state=3): >>><<< 25675 1727203983.24140: stdout chunk (state=3): >>><<< 25675 1727203983.24173: done transferring module to remote 25675 1727203983.24185: _low_level_execute_command(): starting 25675 1727203983.24190: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/ /root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/AnsiballZ_stat.py && sleep 0' 25675 1727203983.24648: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203983.24652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203983.24687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203983.24690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727203983.24692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 
1727203983.24743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203983.24746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203983.24749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203983.24832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203983.27492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203983.27513: stderr chunk (state=3): >>><<< 25675 1727203983.27516: stdout chunk (state=3): >>><<< 25675 1727203983.27530: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203983.27533: _low_level_execute_command(): starting 25675 1727203983.27539: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/AnsiballZ_stat.py && sleep 0' 25675 1727203983.28009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203983.28014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727203983.28017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203983.28062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203983.28065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203983.28067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 25675 1727203983.28156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203983.31507: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 25675 1727203983.31523: stdout chunk (state=3): >>>import 'posix' # <<< 25675 1727203983.31566: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 25675 1727203983.31599: stdout chunk (state=3): >>>import 'time' # <<< 25675 1727203983.31602: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 25675 1727203983.31679: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 25675 1727203983.31684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.31724: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 25675 1727203983.31783: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 25675 1727203983.31796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 25675 1727203983.31817: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a7684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a737b30> <<< 25675 1727203983.31841: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 25675 1727203983.31885: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a76aa50> import '_signal' # <<< 25675 1727203983.31911: stdout chunk (state=3): >>>import '_abc' # <<< 25675 1727203983.31915: stdout chunk (state=3): >>>import 'abc' # <<< 25675 1727203983.31995: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 25675 1727203983.32114: stdout chunk (state=3): >>>import '_collections_abc' # <<< 25675 1727203983.32147: stdout chunk (state=3): >>>import 'genericpath' # <<< 25675 1727203983.32152: stdout chunk (state=3): >>>import 'posixpath' # <<< 25675 1727203983.32188: stdout chunk (state=3): >>>import 'os' # <<< 25675 1727203983.32215: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 25675 1727203983.32231: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 25675 1727203983.32409: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a51d130> # 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 25675 1727203983.32413: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.32426: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a51e060> <<< 25675 1727203983.32455: stdout chunk (state=3): >>>import 'site' # <<< 25675 1727203983.32494: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 25675 1727203983.32872: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 25675 1727203983.32882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 25675 1727203983.32908: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 25675 1727203983.32927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.32950: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 25675 1727203983.33009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 25675 1727203983.33037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 25675 1727203983.33067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 25675 1727203983.33086: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a55bf50> <<< 25675 1727203983.33106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 25675 1727203983.33202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 25675 1727203983.33212: stdout chunk (state=3): >>> import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5700e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 25675 1727203983.33219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 25675 1727203983.33254: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 25675 1727203983.33258: stdout chunk (state=3): >>> <<< 25675 1727203983.33328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.33406: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a593920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc 
matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 25675 1727203983.33410: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a593fb0> <<< 25675 1727203983.33431: stdout chunk (state=3): >>>import '_collections' # <<< 25675 1727203983.33497: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a573bc0> <<< 25675 1727203983.33522: stdout chunk (state=3): >>>import '_functools' # <<< 25675 1727203983.33593: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a571340> <<< 25675 1727203983.33693: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a559100> <<< 25675 1727203983.33724: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 25675 1727203983.33754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 25675 1727203983.33767: stdout chunk (state=3): >>>import '_sre' # <<< 25675 1727203983.33795: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 25675 1727203983.34009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5b78f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5b6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5721e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5b4d10> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 25675 1727203983.34013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e4950> <<< 25675 1727203983.34027: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a558380> <<< 25675 1727203983.34050: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 25675 1727203983.34055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 25675 1727203983.34094: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.34098: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a5e4e00> <<< 25675 1727203983.34114: stdout chunk (state=3): >>>import 
'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e4cb0> <<< 25675 1727203983.34157: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.34174: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.34179: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a5e50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a556ea0> <<< 25675 1727203983.34216: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 25675 1727203983.34223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.34247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 25675 1727203983.34304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 25675 1727203983.34308: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e5790> <<< 25675 1727203983.34310: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e5460> <<< 25675 1727203983.34318: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 25675 1727203983.34355: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 25675 1727203983.34388: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e6660> <<< 25675 1727203983.34403: stdout chunk (state=3): >>>import 'importlib.util' # <<< 25675 1727203983.34418: stdout chunk (state=3): >>>import 'runpy' # <<< 25675 1727203983.34445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 25675 1727203983.34498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 25675 1727203983.34708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a600890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a601fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a602e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a6034a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a6023c0> <<< 25675 1727203983.34717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 25675 1727203983.34730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 25675 1727203983.34781: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.34790: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a603e60> <<< 25675 1727203983.34803: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a603590> <<< 25675 1727203983.34865: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e66c0> <<< 25675 1727203983.34885: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 25675 1727203983.34928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 25675 1727203983.34948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 25675 1727203983.34979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 25675 1727203983.35016: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.35021: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a397d40> <<< 25675 1727203983.35055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 25675 1727203983.35095: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.35099: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a3c4860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c45c0> <<< 25675 1727203983.35137: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.35142: stdout chunk (state=3): >>># extension module '_random' executed from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a3c4890> <<< 25675 1727203983.35184: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 25675 1727203983.35192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 25675 1727203983.35332: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.35481: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a3c51c0> <<< 25675 1727203983.35805: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a3c5b80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c4a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a395ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c6f60> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c5cd0> <<< 25675 1727203983.35829: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e6db0> <<< 25675 1727203983.35859: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 25675 1727203983.35946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.35967: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 25675 1727203983.36020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 25675 1727203983.36054: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3eb2c0> <<< 25675 1727203983.36130: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 25675 1727203983.36148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.36177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 25675 1727203983.36204: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 25675 1727203983.36262: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a40f620> <<< 25675 1727203983.36284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 25675 1727203983.36349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 25675 1727203983.36442: stdout chunk (state=3): >>>import 'ntpath' # <<< 25675 1727203983.36478: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a4703b0> <<< 25675 1727203983.36700: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 25675 1727203983.36746: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a472ae0> <<< 25675 1727203983.36858: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a4704a0> <<< 25675 1727203983.36903: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a439430> <<< 25675 1727203983.36946: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 25675 1727203983.36956: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d29460> <<< 25675 1727203983.36977: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a40e420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c7ec0> <<< 25675 1727203983.37145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 25675 1727203983.37183: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fbf0a40e540> <<< 25675 1727203983.37392: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_ji5ydppm/ansible_stat_payload.zip' # zipimport: zlib available <<< 25675 1727203983.37602: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.37713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 25675 1727203983.37797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 25675 1727203983.37856: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d7f0e0> <<< 25675 1727203983.37868: stdout chunk (state=3): >>>import '_typing' # <<< 25675 1727203983.38165: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d5dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d5d160> <<< 25675 1727203983.38190: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.38193: stdout chunk (state=3): >>>import 'ansible' # <<< 25675 1727203983.38219: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.38301: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 25675 1727203983.40472: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.42343: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d7cfb0> <<< 25675 1727203983.42382: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.42394: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 25675 1727203983.42510: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09daaa80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09daa810> <<< 25675 1727203983.42547: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09daa120> <<< 25675 1727203983.42587: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 25675 1727203983.42590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 25675 1727203983.42657: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09daab70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d7fd70> import 'atexit' # <<< 25675 1727203983.42695: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.42704: stdout chunk (state=3): >>># extension module 
'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09dab770> <<< 25675 1727203983.42755: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.42765: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09dab8c0> <<< 25675 1727203983.43011: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09dabe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 25675 1727203983.43027: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c11c40> <<< 25675 1727203983.43053: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c13860> <<< 25675 1727203983.43105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 25675 1727203983.43108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 25675 1727203983.43180: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c18230> <<< 25675 1727203983.43187: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 25675 1727203983.43228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 25675 1727203983.43244: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c19130> <<< 25675 1727203983.43269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 25675 1727203983.43326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 25675 1727203983.43354: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 25675 1727203983.43443: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c1be60> <<< 25675 1727203983.43492: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import 
'_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c1bfb0> <<< 25675 1727203983.43530: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c1a120> <<< 25675 1727203983.43547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 25675 1727203983.43586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 25675 1727203983.43613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 25675 1727203983.43642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 25675 1727203983.43685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 25675 1727203983.43717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 25675 1727203983.43751: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c23d70> <<< 25675 1727203983.43760: stdout chunk (state=3): >>>import '_tokenize' # <<< 25675 1727203983.43858: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c22840> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c225a0> <<< 25675 1727203983.43886: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 25675 1727203983.44215: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c22b10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c1a630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c67a40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c68170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c69c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c699d0> <<< 25675 1727203983.44232: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 25675 1727203983.44393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 25675 1727203983.44464: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c6c140> <<< 25675 1727203983.44467: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c6a300> <<< 25675 1727203983.44489: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 25675 1727203983.44563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.44580: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 25675 1727203983.44604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 25675 1727203983.44673: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c6f920> <<< 25675 1727203983.44871: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c6c2f0> <<< 25675 1727203983.44962: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c70710> <<< 25675 1727203983.44997: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c70b60> <<< 25675 1727203983.45141: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c70aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 25675 1727203983.45147: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 25675 1727203983.45158: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 25675 1727203983.45174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 25675 1727203983.45204: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 25675 1727203983.45244: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09cfc2f0> <<< 25675 1727203983.45722: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09cfd430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c72a80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c73e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c726c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203983.45858: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.45892: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 25675 1727203983.45928: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 25675 1727203983.45931: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.46109: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.46296: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.47219: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.48153: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 25675 1727203983.48160: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 25675 1727203983.48163: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 25675 1727203983.48190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.48295: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09b01610><<< 25675 1727203983.48309: stdout chunk (state=3): >>> <<< 25675 1727203983.48391: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 25675 1727203983.48403: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b02480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c23d40> <<< 25675 1727203983.48474: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 25675 1727203983.48491: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.48511: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.48531: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 25675 1727203983.48598: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.48787: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.49029: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 25675 1727203983.49050: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b02450> <<< 25675 1727203983.49056: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.49821: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.50641: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203983.50645: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 25675 1727203983.50724: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.50814: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 25675 1727203983.50832: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 25675 1727203983.50854: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.50882: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.50912: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 25675 1727203983.50933: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.51146: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.51405: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 25675 1727203983.51451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 25675 1727203983.51718: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b03560> # zipimport: zlib available # zipimport: zlib available <<< 25675 1727203983.51752: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 25675 1727203983.51767: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 25675 1727203983.51791: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.51845: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.51897: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 25675 1727203983.51907: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.51972: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.52095: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25675 1727203983.52195: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 25675 1727203983.52251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 25675 1727203983.52371: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09b0dfd0> <<< 25675 1727203983.52419: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b0b0e0> <<< 25675 1727203983.52449: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 25675 1727203983.52466: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 25675 1727203983.52612: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.52636: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.52677: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.52754: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 25675 1727203983.52758: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 25675 1727203983.52773: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 25675 1727203983.52849: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 25675 1727203983.52878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 25675 1727203983.52908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 25675 1727203983.52980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 25675 1727203983.53003: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09dfe930> <<< 25675 1727203983.53056: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fbf09dee600> <<< 25675 1727203983.53168: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b0e090> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c72ab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 25675 1727203983.53216: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25675 1727203983.53245: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 25675 1727203983.53319: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 25675 1727203983.53348: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25675 1727203983.53354: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 25675 1727203983.53409: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.53574: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.53919: stdout chunk (state=3): >>># zipimport: zlib available <<< 25675 1727203983.54004: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 25675 1727203983.54024: stdout chunk (state=3): >>># destroy __main__ <<< 25675 1727203983.54585: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 25675 1727203983.54654: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing 
re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 25675 1727203983.54683: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy 
ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 25675 1727203983.54937: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 25675 1727203983.54955: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 25675 1727203983.55054: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy 
json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 25675 1727203983.55248: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 25675 1727203983.55486: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 25675 1727203983.55490: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath <<< 25675 1727203983.55493: stdout chunk (state=3): >>># cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 25675 1727203983.55499: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 25675 1727203983.55501: stdout chunk (state=3): >>># destroy _collections <<< 25675 1727203983.55503: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 
25675 1727203983.55505: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 25675 1727203983.55521: stdout chunk (state=3): >>># destroy _typing <<< 25675 1727203983.55535: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 25675 1727203983.55550: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 25675 1727203983.55633: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 25675 1727203983.55654: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 25675 1727203983.55680: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 25675 1727203983.55737: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 25675 1727203983.56123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
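
(Editor's note, not part of the captured output: the chunked stdout above is the remote half of a single AnsiballZ module run — the wrapper unzips the ansible_stat_payload.zip payload, imports ansible.module_utils, runs the stat module against /run/ostree-booted, streams back {"changed": false, "stat": {"exists": false}}, then tears down interpreter state before the SSH master reports exit status 0. As a rough, hedged approximation of what that check computes — not the actual implementation of ansible.modules.stat, and with a made-up helper name — the core of the result reduces to:

# Minimal sketch of the stat check reported in the log above.
# Assumption: only the "exists" outcome matters here; the real module
# returns many more fields (checksum, mime type, attributes, ...).
import os
import stat as stat_mod

def minimal_stat(path: str, follow: bool = False) -> dict:
    """Return a tiny subset of what Ansible's stat module reports."""
    try:
        st = os.stat(path) if follow else os.lstat(path)
    except FileNotFoundError:
        # Matches the module result streamed back for /run/ostree-booted.
        return {"changed": False, "stat": {"exists": False}}
    return {
        "changed": False,
        "stat": {
            "exists": True,
            "isdir": stat_mod.S_ISDIR(st.st_mode),
            "size": st.st_size,
        },
    }

if __name__ == "__main__":
    # On a non-ostree host this prints {'changed': False, 'stat': {'exists': False}}.
    print(minimal_stat("/run/ostree-booted"))

On this run the file is absent, so the controller can conclude the managed node is not an ostree-based system and continue with the normal task path. The verbatim log resumes below.)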
<<< 25675 1727203983.56126: stdout chunk (state=3): >>><<< 25675 1727203983.56128: stderr chunk (state=3): >>><<< 25675 1727203983.56209: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a7684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a737b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a76aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a51d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a51e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a55bf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5700e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a593920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a593fb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a573bc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a571340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a559100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5b78f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5b6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5721e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5b4d10> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a558380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a5e4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e4cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a5e50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a556ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e5460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e6660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a600890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a601fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fbf0a602e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a6034a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a6023c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a603e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a603590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e66c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a397d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a3c4860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c45c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a3c4890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a3c51c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf0a3c5b80> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c4a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a395ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c6f60> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c5cd0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a5e6db0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3eb2c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a40f620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a4703b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a472ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a4704a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a439430> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d29460> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a40e420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf0a3c7ec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fbf0a40e540> # zipimport: found 30 names in '/tmp/ansible_stat_payload_ji5ydppm/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d7f0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d5dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d5d160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d7cfb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09daaa80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09daa810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09daa120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09daab70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09d7fd70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09dab770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09dab8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09dabe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c11c40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c13860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c18230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c19130> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c1be60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c1bfb0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c1a120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c23d70> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c22840> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c225a0> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c22b10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c1a630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c67a40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c68170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c69c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c699d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c6c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c6a300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c6f920> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c6c2f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c70710> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c70b60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c70aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09cfc2f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09cfd430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c72a80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09c73e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c726c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09b01610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b02480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c23d40> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b02450> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b03560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbf09b0dfd0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b0b0e0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09dfe930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09dee600> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09b0e090> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbf09c72ab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
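The block above is the remote half of the stat module run: the AnsiballZ payload (ansible_stat_payload.zip) is unpacked and imported, the module emits its JSON result for /run/ostree-booted, and the Python interpreter shutdown trace plus the SSH multiplexing stderr follow. For orientation, here is a minimal sketch of the kind of task that produces this output, reconstructed only from the task name, path, module and register variable that appear elsewhere in this log; the actual task in el_repo_setup.yml may differ.

```yaml
# Hedged reconstruction from the log output, not the literal task source.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
```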
[WARNING]: Module invocation had junk after the JSON data: (same Python interpreter shutdown trace as already shown in the module output above) 25675 1727203983.57003: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203983.57006: _low_level_execute_command(): starting 25675 1727203983.57008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1727203983.0070229-25862-29363630402000/ > /dev/null 2>&1 && sleep 0' 25675 1727203983.57895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203983.57899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203983.57902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203983.57905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203983.57907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203983.57909: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727203983.57912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203983.57915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203983.57917: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727203983.57919: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727203983.57921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203983.57923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203983.57935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203983.58000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203983.58007: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203983.58067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203983.58213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203983.58329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203983.60853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203983.60957: stderr chunk (state=3): >>><<< 25675 1727203983.60961: stdout chunk (state=3): >>><<< 25675 1727203983.60964: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203983.60966: handler run complete 25675 1727203983.60972: attempt loop complete, returning result 25675 1727203983.60976: _execute() done 25675 1727203983.60979: dumping result to json 25675 1727203983.60981: done dumping result, returning 25675 1727203983.60983: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [028d2410-947f-41bd-b19d-00000000008f] 25675 1727203983.60984: sending task result for task 028d2410-947f-41bd-b19d-00000000008f 25675 1727203983.61149: done sending task result for task 028d2410-947f-41bd-b19d-00000000008f 25675 1727203983.61152: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 25675 1727203983.61211: no more pending results, returning what we have 25675 1727203983.61214: results queue empty 25675 1727203983.61215: checking for any_errors_fatal 25675 1727203983.61220: done checking for any_errors_fatal 25675 1727203983.61221: checking for max_fail_percentage 25675 1727203983.61223: done checking for max_fail_percentage 25675 1727203983.61223: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.61224: done checking to see if all hosts have failed 25675 1727203983.61225: getting the remaining hosts for this loop 25675 1727203983.61226: done getting the remaining hosts for this loop 25675 1727203983.61229: getting the next task for host managed-node2 25675 1727203983.61234: done getting next task for host managed-node2 25675 1727203983.61237: ^ task is: TASK: Set flag to indicate system is ostree 25675 1727203983.61239: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203983.61242: getting variables 25675 1727203983.61244: in VariableManager get_vars() 25675 1727203983.61274: Calling all_inventory to load vars for managed-node2 25675 1727203983.61298: Calling groups_inventory to load vars for managed-node2 25675 1727203983.61301: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.61312: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.61314: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.61317: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.61520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.61722: done with get_vars() 25675 1727203983.61735: done getting variables 25675 1727203983.61846: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.711) 0:00:03.070 ***** 25675 1727203983.61888: entering _queue_task() for managed-node2/set_fact 25675 1727203983.61890: Creating lock for set_fact 25675 1727203983.62364: worker is 1 (out of 1 available) 25675 1727203983.62379: exiting _queue_task() for managed-node2/set_fact 25675 1727203983.62393: done queuing things up, now waiting for results queue to drain 25675 1727203983.62394: waiting for pending results... 
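The set_fact task queued here turns the registered stat result into the __network_is_ostree fact; its conditional and outcome are visible in the trace that follows (Evaluated conditional (not __network_is_ostree is defined): True, then ok with "__network_is_ostree": false). A minimal sketch of such a task, inferred from those log lines rather than copied from el_repo_setup.yml:22:

```yaml
# Inferred from the evaluated conditional and resulting fact in this log;
# the exact expression at el_repo_setup.yml:22 may differ.
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```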
25675 1727203983.62795: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 25675 1727203983.62800: in run() - task 028d2410-947f-41bd-b19d-000000000090 25675 1727203983.62803: variable 'ansible_search_path' from source: unknown 25675 1727203983.62805: variable 'ansible_search_path' from source: unknown 25675 1727203983.62841: calling self._execute() 25675 1727203983.62950: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.62954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.62957: variable 'omit' from source: magic vars 25675 1727203983.63421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203983.63614: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203983.63685: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203983.63688: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203983.63691: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203983.63763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203983.63783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203983.63806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203983.63828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203983.63922: Evaluated conditional (not __network_is_ostree is defined): True 25675 1727203983.63926: variable 'omit' from source: magic vars 25675 1727203983.63957: variable 'omit' from source: magic vars 25675 1727203983.64046: variable '__ostree_booted_stat' from source: set_fact 25675 1727203983.64086: variable 'omit' from source: magic vars 25675 1727203983.64107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203983.64128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203983.64146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203983.64160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203983.64169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203983.64195: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203983.64198: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.64201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.64307: Set connection var ansible_shell_type to sh 25675 
1727203983.64311: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203983.64313: Set connection var ansible_timeout to 10 25675 1727203983.64316: Set connection var ansible_pipelining to False 25675 1727203983.64318: Set connection var ansible_shell_executable to /bin/sh 25675 1727203983.64320: Set connection var ansible_connection to ssh 25675 1727203983.64581: variable 'ansible_shell_executable' from source: unknown 25675 1727203983.64585: variable 'ansible_connection' from source: unknown 25675 1727203983.64587: variable 'ansible_module_compression' from source: unknown 25675 1727203983.64589: variable 'ansible_shell_type' from source: unknown 25675 1727203983.64592: variable 'ansible_shell_executable' from source: unknown 25675 1727203983.64595: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.64598: variable 'ansible_pipelining' from source: unknown 25675 1727203983.64600: variable 'ansible_timeout' from source: unknown 25675 1727203983.64603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.64608: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203983.64611: variable 'omit' from source: magic vars 25675 1727203983.64612: starting attempt loop 25675 1727203983.64614: running the handler 25675 1727203983.64616: handler run complete 25675 1727203983.64617: attempt loop complete, returning result 25675 1727203983.64619: _execute() done 25675 1727203983.64620: dumping result to json 25675 1727203983.64622: done dumping result, returning 25675 1727203983.64623: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [028d2410-947f-41bd-b19d-000000000090] 25675 1727203983.64625: sending task result for task 028d2410-947f-41bd-b19d-000000000090 25675 1727203983.64685: done sending task result for task 028d2410-947f-41bd-b19d-000000000090 25675 1727203983.64688: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 25675 1727203983.64740: no more pending results, returning what we have 25675 1727203983.64743: results queue empty 25675 1727203983.64744: checking for any_errors_fatal 25675 1727203983.64750: done checking for any_errors_fatal 25675 1727203983.64751: checking for max_fail_percentage 25675 1727203983.64752: done checking for max_fail_percentage 25675 1727203983.64753: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.64754: done checking to see if all hosts have failed 25675 1727203983.64754: getting the remaining hosts for this loop 25675 1727203983.64756: done getting the remaining hosts for this loop 25675 1727203983.64759: getting the next task for host managed-node2 25675 1727203983.64767: done getting next task for host managed-node2 25675 1727203983.64769: ^ task is: TASK: Fix CentOS6 Base repo 25675 1727203983.64771: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203983.64775: getting variables 25675 1727203983.64778: in VariableManager get_vars() 25675 1727203983.64803: Calling all_inventory to load vars for managed-node2 25675 1727203983.64805: Calling groups_inventory to load vars for managed-node2 25675 1727203983.64808: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.64817: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.64819: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.64828: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.65058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.65284: done with get_vars() 25675 1727203983.65297: done getting variables 25675 1727203983.65456: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.036) 0:00:03.106 ***** 25675 1727203983.65493: entering _queue_task() for managed-node2/copy 25675 1727203983.65882: worker is 1 (out of 1 available) 25675 1727203983.65894: exiting _queue_task() for managed-node2/copy 25675 1727203983.65907: done queuing things up, now waiting for results queue to drain 25675 1727203983.65908: waiting for pending results... 
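The ok result above for 'Set flag to indicate system is ostree' comes from a set_fact handler, which runs on the controller; that is why the trace shows no remote command for this task. A minimal sketch of a task that would produce this trace follows; the expression over __ostree_booted_stat is an assumption, while the task name, the fact name, and the "not __network_is_ostree is defined" guard are taken from the trace itself.

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | d(false) }}"
  when: not __network_is_ostree is defined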
25675 1727203983.66151: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 25675 1727203983.66270: in run() - task 028d2410-947f-41bd-b19d-000000000092 25675 1727203983.66294: variable 'ansible_search_path' from source: unknown 25675 1727203983.66307: variable 'ansible_search_path' from source: unknown 25675 1727203983.66340: calling self._execute() 25675 1727203983.66419: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.66431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.66443: variable 'omit' from source: magic vars 25675 1727203983.66896: variable 'ansible_distribution' from source: facts 25675 1727203983.66923: Evaluated conditional (ansible_distribution == 'CentOS'): True 25675 1727203983.67038: variable 'ansible_distribution_major_version' from source: facts 25675 1727203983.67049: Evaluated conditional (ansible_distribution_major_version == '6'): False 25675 1727203983.67056: when evaluation is False, skipping this task 25675 1727203983.67064: _execute() done 25675 1727203983.67075: dumping result to json 25675 1727203983.67084: done dumping result, returning 25675 1727203983.67095: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [028d2410-947f-41bd-b19d-000000000092] 25675 1727203983.67104: sending task result for task 028d2410-947f-41bd-b19d-000000000092 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 25675 1727203983.67268: no more pending results, returning what we have 25675 1727203983.67271: results queue empty 25675 1727203983.67272: checking for any_errors_fatal 25675 1727203983.67486: done checking for any_errors_fatal 25675 1727203983.67488: checking for max_fail_percentage 25675 1727203983.67490: done checking for max_fail_percentage 25675 1727203983.67490: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.67491: done checking to see if all hosts have failed 25675 1727203983.67492: getting the remaining hosts for this loop 25675 1727203983.67493: done getting the remaining hosts for this loop 25675 1727203983.67497: getting the next task for host managed-node2 25675 1727203983.67502: done getting next task for host managed-node2 25675 1727203983.67505: ^ task is: TASK: Include the task 'enable_epel.yml' 25675 1727203983.67508: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203983.67511: getting variables 25675 1727203983.67513: in VariableManager get_vars() 25675 1727203983.67537: Calling all_inventory to load vars for managed-node2 25675 1727203983.67540: Calling groups_inventory to load vars for managed-node2 25675 1727203983.67543: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.67557: done sending task result for task 028d2410-947f-41bd-b19d-000000000092 25675 1727203983.67560: WORKER PROCESS EXITING 25675 1727203983.67569: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.67572: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.67577: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.67744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.67966: done with get_vars() 25675 1727203983.67979: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.025) 0:00:03.131 ***** 25675 1727203983.68078: entering _queue_task() for managed-node2/include_tasks 25675 1727203983.68416: worker is 1 (out of 1 available) 25675 1727203983.68590: exiting _queue_task() for managed-node2/include_tasks 25675 1727203983.68600: done queuing things up, now waiting for results queue to drain 25675 1727203983.68601: waiting for pending results... 25675 1727203983.68727: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 25675 1727203983.68857: in run() - task 028d2410-947f-41bd-b19d-000000000093 25675 1727203983.68881: variable 'ansible_search_path' from source: unknown 25675 1727203983.68890: variable 'ansible_search_path' from source: unknown 25675 1727203983.68933: calling self._execute() 25675 1727203983.69028: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.69040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.69168: variable 'omit' from source: magic vars 25675 1727203983.69639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203983.72046: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203983.72183: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203983.72187: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203983.72230: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203983.72265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203983.72364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203983.72402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203983.72445: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203983.72492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203983.72547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203983.72880: variable '__network_is_ostree' from source: set_fact 25675 1727203983.72886: Evaluated conditional (not __network_is_ostree | d(false)): True 25675 1727203983.72888: _execute() done 25675 1727203983.72891: dumping result to json 25675 1727203983.72893: done dumping result, returning 25675 1727203983.72895: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [028d2410-947f-41bd-b19d-000000000093] 25675 1727203983.72897: sending task result for task 028d2410-947f-41bd-b19d-000000000093 25675 1727203983.72969: done sending task result for task 028d2410-947f-41bd-b19d-000000000093 25675 1727203983.72972: WORKER PROCESS EXITING 25675 1727203983.73012: no more pending results, returning what we have 25675 1727203983.73017: in VariableManager get_vars() 25675 1727203983.73051: Calling all_inventory to load vars for managed-node2 25675 1727203983.73054: Calling groups_inventory to load vars for managed-node2 25675 1727203983.73058: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.73073: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.73078: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.73081: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.73431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.73710: done with get_vars() 25675 1727203983.73719: variable 'ansible_search_path' from source: unknown 25675 1727203983.73720: variable 'ansible_search_path' from source: unknown 25675 1727203983.73768: we have included files to process 25675 1727203983.73769: generating all_blocks data 25675 1727203983.73771: done generating all_blocks data 25675 1727203983.73779: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25675 1727203983.73781: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25675 1727203983.73784: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25675 1727203983.74571: done processing included file 25675 1727203983.74574: iterating over new_blocks loaded from include file 25675 1727203983.74577: in VariableManager get_vars() 25675 1727203983.74590: done with get_vars() 25675 1727203983.74592: filtering new block on tags 25675 1727203983.74625: done filtering new block on tags 25675 1727203983.74628: in VariableManager get_vars() 25675 1727203983.74639: done with get_vars() 25675 1727203983.74641: filtering new block on tags 25675 1727203983.74652: done filtering new block on tags 25675 1727203983.74654: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 25675 1727203983.74660: extending task lists for all hosts with included blocks 25675 1727203983.74770: done extending task lists 25675 1727203983.74772: done processing included files 25675 1727203983.74772: results queue empty 25675 1727203983.74773: checking for any_errors_fatal 25675 1727203983.74778: done checking for any_errors_fatal 25675 1727203983.74779: checking for max_fail_percentage 25675 1727203983.74780: done checking for max_fail_percentage 25675 1727203983.74781: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.74781: done checking to see if all hosts have failed 25675 1727203983.74782: getting the remaining hosts for this loop 25675 1727203983.74783: done getting the remaining hosts for this loop 25675 1727203983.74786: getting the next task for host managed-node2 25675 1727203983.74789: done getting next task for host managed-node2 25675 1727203983.74791: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 25675 1727203983.74794: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203983.74796: getting variables 25675 1727203983.74797: in VariableManager get_vars() 25675 1727203983.74805: Calling all_inventory to load vars for managed-node2 25675 1727203983.74807: Calling groups_inventory to load vars for managed-node2 25675 1727203983.74810: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.74816: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.74824: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.74827: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.75005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.75215: done with get_vars() 25675 1727203983.75224: done getting variables 25675 1727203983.75305: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 25675 1727203983.75523: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.074) 0:00:03.206 ***** 25675 1727203983.75571: entering _queue_task() for managed-node2/command 25675 1727203983.75573: Creating lock for command 25675 1727203983.76182: worker is 1 (out of 1 available) 25675 1727203983.76189: exiting _queue_task() for managed-node2/command 25675 1727203983.76197: done queuing things up, now waiting for results queue to drain 25675 1727203983.76198: waiting for pending results... 
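Before the Create EPEL task above was queued, the trace shows enable_epel.yml being included only after the conditional "not __network_is_ostree | d(false)" evaluated to True, and its blocks being spliced into the host's task list. A sketch of that include, using the task name and conditional from the trace (the relative path is an assumption):

- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)

Because include_tasks is dynamic, the included tasks only join the task list at this point, which matches the "extending task lists for all hosts with included blocks" message above.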
25675 1727203983.76227: running TaskExecutor() for managed-node2/TASK: Create EPEL 10 25675 1727203983.76360: in run() - task 028d2410-947f-41bd-b19d-0000000000ad 25675 1727203983.76380: variable 'ansible_search_path' from source: unknown 25675 1727203983.76387: variable 'ansible_search_path' from source: unknown 25675 1727203983.76437: calling self._execute() 25675 1727203983.76514: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.76536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.76555: variable 'omit' from source: magic vars 25675 1727203983.76992: variable 'ansible_distribution' from source: facts 25675 1727203983.77010: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25675 1727203983.77154: variable 'ansible_distribution_major_version' from source: facts 25675 1727203983.77166: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25675 1727203983.77174: when evaluation is False, skipping this task 25675 1727203983.77196: _execute() done 25675 1727203983.77204: dumping result to json 25675 1727203983.77281: done dumping result, returning 25675 1727203983.77284: done running TaskExecutor() for managed-node2/TASK: Create EPEL 10 [028d2410-947f-41bd-b19d-0000000000ad] 25675 1727203983.77287: sending task result for task 028d2410-947f-41bd-b19d-0000000000ad skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25675 1727203983.77462: no more pending results, returning what we have 25675 1727203983.77467: results queue empty 25675 1727203983.77468: checking for any_errors_fatal 25675 1727203983.77469: done checking for any_errors_fatal 25675 1727203983.77469: checking for max_fail_percentage 25675 1727203983.77471: done checking for max_fail_percentage 25675 1727203983.77472: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.77472: done checking to see if all hosts have failed 25675 1727203983.77473: getting the remaining hosts for this loop 25675 1727203983.77475: done getting the remaining hosts for this loop 25675 1727203983.77480: getting the next task for host managed-node2 25675 1727203983.77488: done getting next task for host managed-node2 25675 1727203983.77491: ^ task is: TASK: Install yum-utils package 25675 1727203983.77495: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203983.77500: getting variables 25675 1727203983.77502: in VariableManager get_vars() 25675 1727203983.77536: Calling all_inventory to load vars for managed-node2 25675 1727203983.77539: Calling groups_inventory to load vars for managed-node2 25675 1727203983.77543: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.77556: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.77560: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.77563: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.78021: done sending task result for task 028d2410-947f-41bd-b19d-0000000000ad 25675 1727203983.78026: WORKER PROCESS EXITING 25675 1727203983.78051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.78291: done with get_vars() 25675 1727203983.78301: done getting variables 25675 1727203983.78410: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.028) 0:00:03.235 ***** 25675 1727203983.78439: entering _queue_task() for managed-node2/package 25675 1727203983.78440: Creating lock for package 25675 1727203983.78767: worker is 1 (out of 1 available) 25675 1727203983.78896: exiting _queue_task() for managed-node2/package 25675 1727203983.78905: done queuing things up, now waiting for results queue to drain 25675 1727203983.78906: waiting for pending results... 
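The skip just above also shows how a templated task name is rendered: 'Create EPEL {{ ansible_distribution_major_version }}' becomes 'Create EPEL 10' in the banner even though the task is then skipped because 10 is not in ['7', '8']. Structurally it is a command task behind the two guards seen in the trace; the actual command is not visible here, so the one in this sketch is only a placeholder:

- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command:
    # placeholder command: the real one is not shown in this trace
    cmd: echo "set up EPEL for EL{{ ansible_distribution_major_version }}"
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']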
25675 1727203983.79193: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 25675 1727203983.79305: in run() - task 028d2410-947f-41bd-b19d-0000000000ae 25675 1727203983.79334: variable 'ansible_search_path' from source: unknown 25675 1727203983.79355: variable 'ansible_search_path' from source: unknown 25675 1727203983.79388: calling self._execute() 25675 1727203983.79550: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.79553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.79556: variable 'omit' from source: magic vars 25675 1727203983.79909: variable 'ansible_distribution' from source: facts 25675 1727203983.79926: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25675 1727203983.80068: variable 'ansible_distribution_major_version' from source: facts 25675 1727203983.80083: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25675 1727203983.80100: when evaluation is False, skipping this task 25675 1727203983.80107: _execute() done 25675 1727203983.80118: dumping result to json 25675 1727203983.80125: done dumping result, returning 25675 1727203983.80134: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [028d2410-947f-41bd-b19d-0000000000ae] 25675 1727203983.80143: sending task result for task 028d2410-947f-41bd-b19d-0000000000ae 25675 1727203983.80271: done sending task result for task 028d2410-947f-41bd-b19d-0000000000ae 25675 1727203983.80274: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25675 1727203983.80443: no more pending results, returning what we have 25675 1727203983.80447: results queue empty 25675 1727203983.80448: checking for any_errors_fatal 25675 1727203983.80453: done checking for any_errors_fatal 25675 1727203983.80453: checking for max_fail_percentage 25675 1727203983.80455: done checking for max_fail_percentage 25675 1727203983.80456: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.80457: done checking to see if all hosts have failed 25675 1727203983.80457: getting the remaining hosts for this loop 25675 1727203983.80459: done getting the remaining hosts for this loop 25675 1727203983.80463: getting the next task for host managed-node2 25675 1727203983.80469: done getting next task for host managed-node2 25675 1727203983.80471: ^ task is: TASK: Enable EPEL 7 25675 1727203983.80478: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203983.80481: getting variables 25675 1727203983.80483: in VariableManager get_vars() 25675 1727203983.80513: Calling all_inventory to load vars for managed-node2 25675 1727203983.80515: Calling groups_inventory to load vars for managed-node2 25675 1727203983.80581: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.80594: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.80598: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.80602: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.80926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.81465: done with get_vars() 25675 1727203983.81481: done getting variables 25675 1727203983.81538: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.032) 0:00:03.268 ***** 25675 1727203983.81691: entering _queue_task() for managed-node2/command 25675 1727203983.82444: worker is 1 (out of 1 available) 25675 1727203983.82457: exiting _queue_task() for managed-node2/command 25675 1727203983.82468: done queuing things up, now waiting for results queue to drain 25675 1727203983.82469: waiting for pending results... 25675 1727203983.82986: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 25675 1727203983.82991: in run() - task 028d2410-947f-41bd-b19d-0000000000af 25675 1727203983.82994: variable 'ansible_search_path' from source: unknown 25675 1727203983.82996: variable 'ansible_search_path' from source: unknown 25675 1727203983.83102: calling self._execute() 25675 1727203983.83208: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.83212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.83214: variable 'omit' from source: magic vars 25675 1727203983.83653: variable 'ansible_distribution' from source: facts 25675 1727203983.83677: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25675 1727203983.83772: variable 'ansible_distribution_major_version' from source: facts 25675 1727203983.83782: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25675 1727203983.83787: when evaluation is False, skipping this task 25675 1727203983.83792: _execute() done 25675 1727203983.83796: dumping result to json 25675 1727203983.83801: done dumping result, returning 25675 1727203983.83808: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [028d2410-947f-41bd-b19d-0000000000af] 25675 1727203983.83814: sending task result for task 028d2410-947f-41bd-b19d-0000000000af skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25675 1727203983.83942: no more pending results, returning what we have 25675 1727203983.83946: results queue empty 25675 1727203983.83946: checking for any_errors_fatal 25675 1727203983.83951: done checking 
for any_errors_fatal 25675 1727203983.83952: checking for max_fail_percentage 25675 1727203983.83953: done checking for max_fail_percentage 25675 1727203983.83954: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.83955: done checking to see if all hosts have failed 25675 1727203983.83955: getting the remaining hosts for this loop 25675 1727203983.83957: done getting the remaining hosts for this loop 25675 1727203983.83960: getting the next task for host managed-node2 25675 1727203983.83978: done getting next task for host managed-node2 25675 1727203983.83981: ^ task is: TASK: Enable EPEL 8 25675 1727203983.83985: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203983.83989: getting variables 25675 1727203983.83991: in VariableManager get_vars() 25675 1727203983.84020: Calling all_inventory to load vars for managed-node2 25675 1727203983.84023: Calling groups_inventory to load vars for managed-node2 25675 1727203983.84026: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.84032: done sending task result for task 028d2410-947f-41bd-b19d-0000000000af 25675 1727203983.84034: WORKER PROCESS EXITING 25675 1727203983.84045: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.84048: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.84051: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.84232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.84349: done with get_vars() 25675 1727203983.84356: done getting variables 25675 1727203983.84406: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.027) 0:00:03.295 ***** 25675 1727203983.84428: entering _queue_task() for managed-node2/command 25675 1727203983.84650: worker is 1 (out of 1 available) 25675 1727203983.84660: exiting _queue_task() for managed-node2/command 25675 1727203983.84672: done queuing things up, now waiting for results queue to drain 25675 1727203983.84673: waiting for pending results... 
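'Install yum-utils package' above went through the generic package action and was skipped under the same version guard. A minimal sketch consistent with the trace; the package name is inferred from the task name and "state: present" is an assumption:

- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']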
25675 1727203983.84812: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 25675 1727203983.84885: in run() - task 028d2410-947f-41bd-b19d-0000000000b0 25675 1727203983.84896: variable 'ansible_search_path' from source: unknown 25675 1727203983.84899: variable 'ansible_search_path' from source: unknown 25675 1727203983.84926: calling self._execute() 25675 1727203983.84985: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.84989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.84999: variable 'omit' from source: magic vars 25675 1727203983.85580: variable 'ansible_distribution' from source: facts 25675 1727203983.85584: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25675 1727203983.85586: variable 'ansible_distribution_major_version' from source: facts 25675 1727203983.85589: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25675 1727203983.85591: when evaluation is False, skipping this task 25675 1727203983.85593: _execute() done 25675 1727203983.85597: dumping result to json 25675 1727203983.85599: done dumping result, returning 25675 1727203983.85609: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [028d2410-947f-41bd-b19d-0000000000b0] 25675 1727203983.85618: sending task result for task 028d2410-947f-41bd-b19d-0000000000b0 25675 1727203983.85724: done sending task result for task 028d2410-947f-41bd-b19d-0000000000b0 25675 1727203983.85730: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25675 1727203983.85810: no more pending results, returning what we have 25675 1727203983.85814: results queue empty 25675 1727203983.85814: checking for any_errors_fatal 25675 1727203983.85818: done checking for any_errors_fatal 25675 1727203983.85819: checking for max_fail_percentage 25675 1727203983.85820: done checking for max_fail_percentage 25675 1727203983.85821: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.85822: done checking to see if all hosts have failed 25675 1727203983.85823: getting the remaining hosts for this loop 25675 1727203983.85824: done getting the remaining hosts for this loop 25675 1727203983.85827: getting the next task for host managed-node2 25675 1727203983.85835: done getting next task for host managed-node2 25675 1727203983.85837: ^ task is: TASK: Enable EPEL 6 25675 1727203983.85840: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203983.85844: getting variables 25675 1727203983.85846: in VariableManager get_vars() 25675 1727203983.85891: Calling all_inventory to load vars for managed-node2 25675 1727203983.85894: Calling groups_inventory to load vars for managed-node2 25675 1727203983.85897: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.85908: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.85911: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.85913: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.86281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.86594: done with get_vars() 25675 1727203983.86607: done getting variables 25675 1727203983.86680: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.022) 0:00:03.318 ***** 25675 1727203983.86716: entering _queue_task() for managed-node2/copy 25675 1727203983.87122: worker is 1 (out of 1 available) 25675 1727203983.87133: exiting _queue_task() for managed-node2/copy 25675 1727203983.87144: done queuing things up, now waiting for results queue to drain 25675 1727203983.87146: waiting for pending results... 25675 1727203983.87803: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 25675 1727203983.88031: in run() - task 028d2410-947f-41bd-b19d-0000000000b2 25675 1727203983.88035: variable 'ansible_search_path' from source: unknown 25675 1727203983.88037: variable 'ansible_search_path' from source: unknown 25675 1727203983.88248: calling self._execute() 25675 1727203983.88251: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.88254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.88256: variable 'omit' from source: magic vars 25675 1727203983.88824: variable 'ansible_distribution' from source: facts 25675 1727203983.88843: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25675 1727203983.88964: variable 'ansible_distribution_major_version' from source: facts 25675 1727203983.88982: Evaluated conditional (ansible_distribution_major_version == '6'): False 25675 1727203983.88992: when evaluation is False, skipping this task 25675 1727203983.89000: _execute() done 25675 1727203983.89012: dumping result to json 25675 1727203983.89021: done dumping result, returning 25675 1727203983.89033: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [028d2410-947f-41bd-b19d-0000000000b2] 25675 1727203983.89044: sending task result for task 028d2410-947f-41bd-b19d-0000000000b2 25675 1727203983.89329: done sending task result for task 028d2410-947f-41bd-b19d-0000000000b2 25675 1727203983.89332: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 25675 1727203983.89386: no more pending results, returning what we have 25675 
1727203983.89390: results queue empty 25675 1727203983.89391: checking for any_errors_fatal 25675 1727203983.89395: done checking for any_errors_fatal 25675 1727203983.89396: checking for max_fail_percentage 25675 1727203983.89398: done checking for max_fail_percentage 25675 1727203983.89399: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.89400: done checking to see if all hosts have failed 25675 1727203983.89401: getting the remaining hosts for this loop 25675 1727203983.89402: done getting the remaining hosts for this loop 25675 1727203983.89406: getting the next task for host managed-node2 25675 1727203983.89415: done getting next task for host managed-node2 25675 1727203983.89418: ^ task is: TASK: Set network provider to 'nm' 25675 1727203983.89421: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203983.89425: getting variables 25675 1727203983.89427: in VariableManager get_vars() 25675 1727203983.89459: Calling all_inventory to load vars for managed-node2 25675 1727203983.89463: Calling groups_inventory to load vars for managed-node2 25675 1727203983.89467: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.89484: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.89488: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.89491: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.90022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.90183: done with get_vars() 25675 1727203983.90193: done getting variables 25675 1727203983.90250: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.035) 0:00:03.354 ***** 25675 1727203983.90281: entering _queue_task() for managed-node2/set_fact 25675 1727203983.90602: worker is 1 (out of 1 available) 25675 1727203983.90612: exiting _queue_task() for managed-node2/set_fact 25675 1727203983.90622: done queuing things up, now waiting for results queue to drain 25675 1727203983.90623: waiting for pending results... 
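With the included EPEL tasks all skipped, control returns to the main task list of tests_ethernet_nm.yml and the provider under test is pinned via set_fact; its result ("network_provider": "nm") appears a few lines further down. A sketch of the task, reconstructed from the task name and the reported fact:

- name: Set network provider to 'nm'
  ansible.builtin.set_fact:
    network_provider: nm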
25675 1727203983.90935: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 25675 1727203983.91035: in run() - task 028d2410-947f-41bd-b19d-000000000007 25675 1727203983.91384: variable 'ansible_search_path' from source: unknown 25675 1727203983.91388: calling self._execute() 25675 1727203983.91423: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.91435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.91450: variable 'omit' from source: magic vars 25675 1727203983.91774: variable 'omit' from source: magic vars 25675 1727203983.91813: variable 'omit' from source: magic vars 25675 1727203983.91854: variable 'omit' from source: magic vars 25675 1727203983.91908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203983.92031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203983.92057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203983.92118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203983.92137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203983.92238: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203983.92249: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.92258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.92446: Set connection var ansible_shell_type to sh 25675 1727203983.92462: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203983.92480: Set connection var ansible_timeout to 10 25675 1727203983.92494: Set connection var ansible_pipelining to False 25675 1727203983.92507: Set connection var ansible_shell_executable to /bin/sh 25675 1727203983.92514: Set connection var ansible_connection to ssh 25675 1727203983.92558: variable 'ansible_shell_executable' from source: unknown 25675 1727203983.92567: variable 'ansible_connection' from source: unknown 25675 1727203983.92580: variable 'ansible_module_compression' from source: unknown 25675 1727203983.92588: variable 'ansible_shell_type' from source: unknown 25675 1727203983.92594: variable 'ansible_shell_executable' from source: unknown 25675 1727203983.92600: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.92608: variable 'ansible_pipelining' from source: unknown 25675 1727203983.92614: variable 'ansible_timeout' from source: unknown 25675 1727203983.92621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.92799: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203983.92855: variable 'omit' from source: magic vars 25675 1727203983.92858: starting attempt loop 25675 1727203983.92860: running the handler 25675 1727203983.92863: handler run complete 25675 1727203983.92866: attempt loop complete, returning result 25675 1727203983.92868: _execute() done 25675 1727203983.92880: 
dumping result to json 25675 1727203983.92889: done dumping result, returning 25675 1727203983.92965: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [028d2410-947f-41bd-b19d-000000000007] 25675 1727203983.92968: sending task result for task 028d2410-947f-41bd-b19d-000000000007 25675 1727203983.93045: done sending task result for task 028d2410-947f-41bd-b19d-000000000007 25675 1727203983.93049: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 25675 1727203983.93133: no more pending results, returning what we have 25675 1727203983.93136: results queue empty 25675 1727203983.93137: checking for any_errors_fatal 25675 1727203983.93147: done checking for any_errors_fatal 25675 1727203983.93147: checking for max_fail_percentage 25675 1727203983.93150: done checking for max_fail_percentage 25675 1727203983.93150: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.93151: done checking to see if all hosts have failed 25675 1727203983.93152: getting the remaining hosts for this loop 25675 1727203983.93154: done getting the remaining hosts for this loop 25675 1727203983.93158: getting the next task for host managed-node2 25675 1727203983.93166: done getting next task for host managed-node2 25675 1727203983.93171: ^ task is: TASK: meta (flush_handlers) 25675 1727203983.93173: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203983.93181: getting variables 25675 1727203983.93183: in VariableManager get_vars() 25675 1727203983.93218: Calling all_inventory to load vars for managed-node2 25675 1727203983.93222: Calling groups_inventory to load vars for managed-node2 25675 1727203983.93225: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.93239: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.93242: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.93245: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.93627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.93896: done with get_vars() 25675 1727203983.93908: done getting variables 25675 1727203983.93981: in VariableManager get_vars() 25675 1727203983.93992: Calling all_inventory to load vars for managed-node2 25675 1727203983.93994: Calling groups_inventory to load vars for managed-node2 25675 1727203983.93996: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.94001: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.94003: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.94005: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.94171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.94358: done with get_vars() 25675 1727203983.94377: done queuing things up, now waiting for results queue to drain 25675 1727203983.94380: results queue empty 25675 1727203983.94380: checking for any_errors_fatal 25675 1727203983.94383: done checking for any_errors_fatal 25675 1727203983.94384: checking for 
max_fail_percentage 25675 1727203983.94385: done checking for max_fail_percentage 25675 1727203983.94386: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.94387: done checking to see if all hosts have failed 25675 1727203983.94388: getting the remaining hosts for this loop 25675 1727203983.94389: done getting the remaining hosts for this loop 25675 1727203983.94391: getting the next task for host managed-node2 25675 1727203983.94396: done getting next task for host managed-node2 25675 1727203983.94397: ^ task is: TASK: meta (flush_handlers) 25675 1727203983.94398: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203983.94406: getting variables 25675 1727203983.94407: in VariableManager get_vars() 25675 1727203983.94416: Calling all_inventory to load vars for managed-node2 25675 1727203983.94417: Calling groups_inventory to load vars for managed-node2 25675 1727203983.94420: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.94424: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.94426: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.94429: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.94552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.94724: done with get_vars() 25675 1727203983.94733: done getting variables 25675 1727203983.94785: in VariableManager get_vars() 25675 1727203983.94793: Calling all_inventory to load vars for managed-node2 25675 1727203983.94795: Calling groups_inventory to load vars for managed-node2 25675 1727203983.94796: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.94800: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.94802: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.94804: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.94921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.95125: done with get_vars() 25675 1727203983.95135: done queuing things up, now waiting for results queue to drain 25675 1727203983.95137: results queue empty 25675 1727203983.95137: checking for any_errors_fatal 25675 1727203983.95138: done checking for any_errors_fatal 25675 1727203983.95139: checking for max_fail_percentage 25675 1727203983.95140: done checking for max_fail_percentage 25675 1727203983.95140: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.95141: done checking to see if all hosts have failed 25675 1727203983.95142: getting the remaining hosts for this loop 25675 1727203983.95143: done getting the remaining hosts for this loop 25675 1727203983.95145: getting the next task for host managed-node2 25675 1727203983.95148: done getting next task for host managed-node2 25675 1727203983.95148: ^ task is: None 25675 1727203983.95150: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 25675 1727203983.95151: done queuing things up, now waiting for results queue to drain 25675 1727203983.95151: results queue empty 25675 1727203983.95152: checking for any_errors_fatal 25675 1727203983.95152: done checking for any_errors_fatal 25675 1727203983.95153: checking for max_fail_percentage 25675 1727203983.95154: done checking for max_fail_percentage 25675 1727203983.95154: checking to see if all hosts have failed and the running result is not ok 25675 1727203983.95155: done checking to see if all hosts have failed 25675 1727203983.95157: getting the next task for host managed-node2 25675 1727203983.95158: done getting next task for host managed-node2 25675 1727203983.95159: ^ task is: None 25675 1727203983.95160: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203983.95206: in VariableManager get_vars() 25675 1727203983.95220: done with get_vars() 25675 1727203983.95225: in VariableManager get_vars() 25675 1727203983.95234: done with get_vars() 25675 1727203983.95238: variable 'omit' from source: magic vars 25675 1727203983.95272: in VariableManager get_vars() 25675 1727203983.95292: done with get_vars() 25675 1727203983.95313: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 25675 1727203983.95499: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727203983.95525: getting the remaining hosts for this loop 25675 1727203983.95527: done getting the remaining hosts for this loop 25675 1727203983.95529: getting the next task for host managed-node2 25675 1727203983.95532: done getting next task for host managed-node2 25675 1727203983.95534: ^ task is: TASK: Gathering Facts 25675 1727203983.95535: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203983.95537: getting variables 25675 1727203983.95538: in VariableManager get_vars() 25675 1727203983.95546: Calling all_inventory to load vars for managed-node2 25675 1727203983.95548: Calling groups_inventory to load vars for managed-node2 25675 1727203983.95550: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203983.95554: Calling all_plugins_play to load vars for managed-node2 25675 1727203983.95567: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203983.95572: Calling groups_plugins_play to load vars for managed-node2 25675 1727203983.95710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203983.95901: done with get_vars() 25675 1727203983.95910: done getting variables 25675 1727203983.95950: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.056) 0:00:03.410 ***** 25675 1727203983.95978: entering _queue_task() for managed-node2/gather_facts 25675 1727203983.96309: worker is 1 (out of 1 available) 25675 1727203983.96320: exiting _queue_task() for managed-node2/gather_facts 25675 1727203983.96332: done queuing things up, now waiting for results queue to drain 25675 1727203983.96333: waiting for pending results... 
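At this point the first play is complete: the two 'meta (flush_handlers)' entries are the implicit handler flushes Ansible appends to a play, and the strategy then opens the next play from playbooks/tests_ethernet.yml with its implicit fact gathering. A skeletal sketch of that play header; only the play name and the fact-gathering step are confirmed by the trace, the host pattern is an assumption:

- name: Play for showing the network provider
  hosts: all          # assumption; the trace only shows managed-node2 being targeted
  gather_facts: true  # corresponds to the TASK [Gathering Facts] queued above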
25675 1727203983.96603: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727203983.96784: in run() - task 028d2410-947f-41bd-b19d-0000000000d8 25675 1727203983.96788: variable 'ansible_search_path' from source: unknown 25675 1727203983.96791: calling self._execute() 25675 1727203983.96857: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.96867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.96889: variable 'omit' from source: magic vars 25675 1727203983.97256: variable 'ansible_distribution_major_version' from source: facts 25675 1727203983.97277: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203983.97290: variable 'omit' from source: magic vars 25675 1727203983.97318: variable 'omit' from source: magic vars 25675 1727203983.97361: variable 'omit' from source: magic vars 25675 1727203983.97450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203983.97453: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203983.97482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203983.97503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203983.97520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203983.97555: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203983.97567: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.97583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.97780: Set connection var ansible_shell_type to sh 25675 1727203983.97784: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203983.97787: Set connection var ansible_timeout to 10 25675 1727203983.97789: Set connection var ansible_pipelining to False 25675 1727203983.97790: Set connection var ansible_shell_executable to /bin/sh 25675 1727203983.97793: Set connection var ansible_connection to ssh 25675 1727203983.97794: variable 'ansible_shell_executable' from source: unknown 25675 1727203983.97796: variable 'ansible_connection' from source: unknown 25675 1727203983.97798: variable 'ansible_module_compression' from source: unknown 25675 1727203983.97800: variable 'ansible_shell_type' from source: unknown 25675 1727203983.97801: variable 'ansible_shell_executable' from source: unknown 25675 1727203983.97803: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203983.97805: variable 'ansible_pipelining' from source: unknown 25675 1727203983.97806: variable 'ansible_timeout' from source: unknown 25675 1727203983.97808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203983.97988: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203983.98004: variable 'omit' from source: magic vars 25675 1727203983.98012: starting attempt loop 25675 1727203983.98018: running the 
handler 25675 1727203983.98041: variable 'ansible_facts' from source: unknown 25675 1727203983.98063: _low_level_execute_command(): starting 25675 1727203983.98080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203983.98898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203983.98908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203983.98928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203983.99054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203984.01455: stdout chunk (state=3): >>>/root <<< 25675 1727203984.01635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203984.01658: stderr chunk (state=3): >>><<< 25675 1727203984.01668: stdout chunk (state=3): >>><<< 25675 1727203984.01701: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25675 1727203984.01803: _low_level_execute_command(): starting 25675 1727203984.01807: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719 `" && echo ansible-tmp-1727203984.0170882-25913-88739827612719="` echo /root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719 `" ) && sleep 0' 
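The two low-level commands above resolve the remote home directory (`echo ~`) and create the per-task temporary directory under ~/.ansible/tmp, where the AnsiballZ payload is staged next. A minimal sketch of relocating that staging area for one host via the documented ansible_remote_tmp variable; the path used below is an example, not one taken from this run.

```yaml
# Hypothetical override of the remote staging directory created by the
# "umask 77 && mkdir -p ..." command above (the path is illustrative).
- hosts: managed-node2
  gather_facts: false
  vars:
    ansible_remote_tmp: /var/tmp/.ansible/tmp
  tasks:
    - name: Any module call now stages its AnsiballZ_*.py under the new tempdir
      ansible.builtin.ping:
```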
25675 1727203984.02384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203984.02398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203984.02497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203984.02501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203984.02559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203984.02591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203984.02605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203984.02914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203984.05654: stdout chunk (state=3): >>>ansible-tmp-1727203984.0170882-25913-88739827612719=/root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719 <<< 25675 1727203984.05884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203984.05887: stdout chunk (state=3): >>><<< 25675 1727203984.05890: stderr chunk (state=3): >>><<< 25675 1727203984.05893: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203984.0170882-25913-88739827612719=/root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25675 1727203984.06192: variable 'ansible_module_compression' from source: unknown 25675 1727203984.06196: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727203984.06199: variable 'ansible_facts' from source: unknown 25675 1727203984.06658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/AnsiballZ_setup.py 25675 1727203984.07343: Sending initial data 25675 1727203984.07356: Sent initial data (153 bytes) 25675 1727203984.08408: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203984.08448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203984.08467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203984.08587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203984.10840: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25675 1727203984.10859: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 25675 1727203984.10894: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203984.10966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203984.11197: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpvfal0zdy /root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/AnsiballZ_setup.py <<< 25675 1727203984.11200: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/AnsiballZ_setup.py" <<< 25675 1727203984.11367: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpvfal0zdy" to remote "/root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/AnsiballZ_setup.py" <<< 25675 1727203984.13048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203984.13156: stderr chunk (state=3): >>><<< 25675 1727203984.13173: stdout chunk (state=3): >>><<< 25675 1727203984.13200: done transferring module to remote 25675 1727203984.13216: _low_level_execute_command(): starting 25675 1727203984.13228: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/ /root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/AnsiballZ_setup.py && sleep 0' 25675 1727203984.13840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203984.13854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203984.13870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203984.13892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203984.13908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203984.13923: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727203984.13936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203984.14032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203984.14055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203984.14173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203984.16787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203984.16991: stderr chunk (state=3): >>><<< 25675 1727203984.16995: stdout chunk (state=3): >>><<< 25675 1727203984.16998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25675 1727203984.17000: _low_level_execute_command(): starting 25675 1727203984.17003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/AnsiballZ_setup.py && sleep 0' 25675 1727203984.17854: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203984.17896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203984.17912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203984.17933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203984.18058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203984.98735: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "04", "epoch": "1727203984", "epoch_int": "1727203984", "date": "2024-09-24", "time": "14:53:04", "iso8601_micro": "2024-09-24T18:53:04.602659Z", 
"iso8601": "2024-09-24T18:53:04Z", "iso8601_basic": "20240924T145304602659", "iso8601_basic_short": "20240924T145304", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.45751953125, "5m": 0.4169921875, "15m": 0.22021484375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2922, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 609, "free": 2922}, "nocache": {"free": 3279, "used": 252}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU"<<< 25675 1727203984.98800: stdout chunk (state=3): >>>, "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": 
{"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 570, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785481216, "block_size": 4096, "block_total": 65519099, "block_available": 63912471, "block_used": 1606628, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727203985.01511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203985.01524: stderr chunk (state=3): >>><<< 25675 1727203985.01683: stdout chunk (state=3): >>><<< 25675 1727203985.01689: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "04", "epoch": "1727203984", "epoch_int": "1727203984", "date": "2024-09-24", "time": "14:53:04", "iso8601_micro": "2024-09-24T18:53:04.602659Z", "iso8601": "2024-09-24T18:53:04Z", "iso8601_basic": "20240924T145304602659", "iso8601_basic_short": "20240924T145304", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.45751953125, "5m": 0.4169921875, "15m": 0.22021484375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, 
"ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2922, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 609, "free": 2922}, "nocache": {"free": 3279, "used": 252}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 570, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785481216, "block_size": 4096, "block_total": 65519099, "block_available": 63912471, "block_used": 1606628, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off 
[fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], 
"fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727203985.01934: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203985.01962: _low_level_execute_command(): starting 25675 1727203985.01972: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203984.0170882-25913-88739827612719/ > /dev/null 2>&1 && sleep 0' 25675 1727203985.02639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203985.02652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203985.02700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203985.02803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203985.02824: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203985.02936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203985.05540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203985.05582: stdout chunk (state=3): >>><<< 25675 1727203985.05585: stderr chunk (state=3): >>><<< 25675 1727203985.05606: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25675 1727203985.05621: handler run complete 25675 1727203985.05982: variable 'ansible_facts' from source: unknown 25675 1727203985.05986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.06182: variable 'ansible_facts' from source: unknown 25675 1727203985.06285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.06451: attempt loop complete, returning result 25675 1727203985.06461: _execute() done 25675 1727203985.06469: dumping result to json 25675 1727203985.06505: done dumping result, returning 25675 1727203985.06519: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-0000000000d8] 25675 1727203985.06529: sending task result for task 028d2410-947f-41bd-b19d-0000000000d8 ok: [managed-node2] 25675 1727203985.07512: no more pending results, returning what we have 25675 1727203985.07515: results queue empty 25675 1727203985.07516: checking for any_errors_fatal 25675 1727203985.07517: done checking for any_errors_fatal 25675 1727203985.07518: checking for max_fail_percentage 25675 1727203985.07519: done checking for max_fail_percentage 25675 1727203985.07520: checking to see if all hosts have failed and the running result is not ok 25675 1727203985.07521: done checking to see if all hosts have failed 25675 1727203985.07522: getting the remaining hosts for this loop 25675 1727203985.07523: done getting the remaining hosts for this loop 25675 1727203985.07526: getting the next task for host managed-node2 25675 1727203985.07531: done getting next task for host managed-node2 25675 1727203985.07533: ^ task is: TASK: meta (flush_handlers) 25675 1727203985.07534: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203985.07538: getting variables 25675 1727203985.07539: in VariableManager get_vars() 25675 1727203985.07559: Calling all_inventory to load vars for managed-node2 25675 1727203985.07562: Calling groups_inventory to load vars for managed-node2 25675 1727203985.07565: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.07598: done sending task result for task 028d2410-947f-41bd-b19d-0000000000d8 25675 1727203985.07601: WORKER PROCESS EXITING 25675 1727203985.07611: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.07614: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.07617: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.07791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.07988: done with get_vars() 25675 1727203985.07998: done getting variables 25675 1727203985.08078: in VariableManager get_vars() 25675 1727203985.08088: Calling all_inventory to load vars for managed-node2 25675 1727203985.08090: Calling groups_inventory to load vars for managed-node2 25675 1727203985.08092: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.08097: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.08099: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.08101: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.08255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.08437: done with get_vars() 25675 1727203985.08481: done queuing things up, now waiting for results queue to drain 25675 1727203985.08483: results queue empty 25675 1727203985.08484: checking for any_errors_fatal 25675 1727203985.08487: done checking for any_errors_fatal 25675 1727203985.08488: checking for max_fail_percentage 25675 1727203985.08489: done checking for max_fail_percentage 25675 1727203985.08490: checking to see if all hosts have failed and the running result is not ok 25675 1727203985.08496: done checking to see if all hosts have failed 25675 1727203985.08497: getting the remaining hosts for this loop 25675 1727203985.08498: done getting the remaining hosts for this loop 25675 1727203985.08500: getting the next task for host managed-node2 25675 1727203985.08504: done getting next task for host managed-node2 25675 1727203985.08506: ^ task is: TASK: Show inside ethernet tests 25675 1727203985.08507: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203985.08509: getting variables 25675 1727203985.08510: in VariableManager get_vars() 25675 1727203985.08518: Calling all_inventory to load vars for managed-node2 25675 1727203985.08520: Calling groups_inventory to load vars for managed-node2 25675 1727203985.08522: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.08527: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.08529: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.08532: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.08759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.08977: done with get_vars() 25675 1727203985.08996: done getting variables 25675 1727203985.09069: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Tuesday 24 September 2024 14:53:05 -0400 (0:00:01.131) 0:00:04.542 ***** 25675 1727203985.09109: entering _queue_task() for managed-node2/debug 25675 1727203985.09111: Creating lock for debug 25675 1727203985.09487: worker is 1 (out of 1 available) 25675 1727203985.09498: exiting _queue_task() for managed-node2/debug 25675 1727203985.09508: done queuing things up, now waiting for results queue to drain 25675 1727203985.09509: waiting for pending results... 
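The timer line above shows that fact gathering accounted for about 1.1 s of the 4.5 s elapsed so far, and the recorded module invocation ran with gather_subset: ["all"]. When a test only needs network-related facts, the documented gather_subset filter of ansible.builtin.setup can narrow that; a hedged sketch (not something this playbook does):

```yaml
# Restrict fact gathering to the network subset instead of "all".
- name: Gather only network facts
  ansible.builtin.setup:
    gather_subset:
      - "!all"
      - "!min"
      - network
```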
25675 1727203985.09728: running TaskExecutor() for managed-node2/TASK: Show inside ethernet tests 25675 1727203985.09832: in run() - task 028d2410-947f-41bd-b19d-00000000000b 25675 1727203985.09853: variable 'ansible_search_path' from source: unknown 25675 1727203985.09900: calling self._execute() 25675 1727203985.09979: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203985.09994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.10010: variable 'omit' from source: magic vars 25675 1727203985.10413: variable 'ansible_distribution_major_version' from source: facts 25675 1727203985.10434: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203985.10447: variable 'omit' from source: magic vars 25675 1727203985.10488: variable 'omit' from source: magic vars 25675 1727203985.10545: variable 'omit' from source: magic vars 25675 1727203985.10591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203985.10644: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203985.10670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203985.10695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203985.10717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203985.10759: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203985.10768: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203985.10777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.10899: Set connection var ansible_shell_type to sh 25675 1727203985.10912: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203985.10933: Set connection var ansible_timeout to 10 25675 1727203985.10958: Set connection var ansible_pipelining to False 25675 1727203985.10961: Set connection var ansible_shell_executable to /bin/sh 25675 1727203985.10982: Set connection var ansible_connection to ssh 25675 1727203985.11005: variable 'ansible_shell_executable' from source: unknown 25675 1727203985.11042: variable 'ansible_connection' from source: unknown 25675 1727203985.11045: variable 'ansible_module_compression' from source: unknown 25675 1727203985.11048: variable 'ansible_shell_type' from source: unknown 25675 1727203985.11050: variable 'ansible_shell_executable' from source: unknown 25675 1727203985.11052: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203985.11065: variable 'ansible_pipelining' from source: unknown 25675 1727203985.11068: variable 'ansible_timeout' from source: unknown 25675 1727203985.11070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.11261: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203985.11265: variable 'omit' from source: magic vars 25675 1727203985.11267: starting attempt loop 25675 1727203985.11270: running the 
handler 25675 1727203985.11317: handler run complete 25675 1727203985.11348: attempt loop complete, returning result 25675 1727203985.11357: _execute() done 25675 1727203985.11371: dumping result to json 25675 1727203985.11395: done dumping result, returning 25675 1727203985.11505: done running TaskExecutor() for managed-node2/TASK: Show inside ethernet tests [028d2410-947f-41bd-b19d-00000000000b] 25675 1727203985.11508: sending task result for task 028d2410-947f-41bd-b19d-00000000000b 25675 1727203985.11573: done sending task result for task 028d2410-947f-41bd-b19d-00000000000b 25675 1727203985.11578: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Inside ethernet tests 25675 1727203985.11637: no more pending results, returning what we have 25675 1727203985.11641: results queue empty 25675 1727203985.11642: checking for any_errors_fatal 25675 1727203985.11644: done checking for any_errors_fatal 25675 1727203985.11644: checking for max_fail_percentage 25675 1727203985.11646: done checking for max_fail_percentage 25675 1727203985.11647: checking to see if all hosts have failed and the running result is not ok 25675 1727203985.11648: done checking to see if all hosts have failed 25675 1727203985.11648: getting the remaining hosts for this loop 25675 1727203985.11650: done getting the remaining hosts for this loop 25675 1727203985.11654: getting the next task for host managed-node2 25675 1727203985.11660: done getting next task for host managed-node2 25675 1727203985.11663: ^ task is: TASK: Show network_provider 25675 1727203985.11665: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203985.11668: getting variables 25675 1727203985.11670: in VariableManager get_vars() 25675 1727203985.11745: Calling all_inventory to load vars for managed-node2 25675 1727203985.11748: Calling groups_inventory to load vars for managed-node2 25675 1727203985.11752: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.11763: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.11766: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.11769: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.12237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.12588: done with get_vars() 25675 1727203985.12601: done getting variables 25675 1727203985.12665: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Tuesday 24 September 2024 14:53:05 -0400 (0:00:00.036) 0:00:04.578 ***** 25675 1727203985.12704: entering _queue_task() for managed-node2/debug 25675 1727203985.13127: worker is 1 (out of 1 available) 25675 1727203985.13137: exiting _queue_task() for managed-node2/debug 25675 1727203985.13147: done queuing things up, now waiting for results queue to drain 25675 1727203985.13148: waiting for pending results... 25675 1727203985.13355: running TaskExecutor() for managed-node2/TASK: Show network_provider 25675 1727203985.13451: in run() - task 028d2410-947f-41bd-b19d-00000000000c 25675 1727203985.13454: variable 'ansible_search_path' from source: unknown 25675 1727203985.13479: calling self._execute() 25675 1727203985.13577: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203985.13590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.13605: variable 'omit' from source: magic vars 25675 1727203985.14016: variable 'ansible_distribution_major_version' from source: facts 25675 1727203985.14036: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203985.14049: variable 'omit' from source: magic vars 25675 1727203985.14085: variable 'omit' from source: magic vars 25675 1727203985.14141: variable 'omit' from source: magic vars 25675 1727203985.14189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203985.14239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203985.14280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203985.14288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203985.14303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203985.14346: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203985.14432: variable 'ansible_host' from source: host vars for 
'managed-node2' 25675 1727203985.14435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.14474: Set connection var ansible_shell_type to sh 25675 1727203985.14489: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203985.14502: Set connection var ansible_timeout to 10 25675 1727203985.14513: Set connection var ansible_pipelining to False 25675 1727203985.14523: Set connection var ansible_shell_executable to /bin/sh 25675 1727203985.14541: Set connection var ansible_connection to ssh 25675 1727203985.14573: variable 'ansible_shell_executable' from source: unknown 25675 1727203985.14585: variable 'ansible_connection' from source: unknown 25675 1727203985.14592: variable 'ansible_module_compression' from source: unknown 25675 1727203985.14599: variable 'ansible_shell_type' from source: unknown 25675 1727203985.14605: variable 'ansible_shell_executable' from source: unknown 25675 1727203985.14611: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203985.14619: variable 'ansible_pipelining' from source: unknown 25675 1727203985.14626: variable 'ansible_timeout' from source: unknown 25675 1727203985.14650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.14979: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203985.14983: variable 'omit' from source: magic vars 25675 1727203985.14986: starting attempt loop 25675 1727203985.14988: running the handler 25675 1727203985.14990: variable 'network_provider' from source: set_fact 25675 1727203985.15054: variable 'network_provider' from source: set_fact 25675 1727203985.15199: handler run complete 25675 1727203985.15283: attempt loop complete, returning result 25675 1727203985.15286: _execute() done 25675 1727203985.15289: dumping result to json 25675 1727203985.15291: done dumping result, returning 25675 1727203985.15295: done running TaskExecutor() for managed-node2/TASK: Show network_provider [028d2410-947f-41bd-b19d-00000000000c] 25675 1727203985.15297: sending task result for task 028d2410-947f-41bd-b19d-00000000000c 25675 1727203985.15366: done sending task result for task 028d2410-947f-41bd-b19d-00000000000c 25675 1727203985.15369: WORKER PROCESS EXITING ok: [managed-node2] => { "network_provider": "nm" } 25675 1727203985.15422: no more pending results, returning what we have 25675 1727203985.15426: results queue empty 25675 1727203985.15427: checking for any_errors_fatal 25675 1727203985.15434: done checking for any_errors_fatal 25675 1727203985.15435: checking for max_fail_percentage 25675 1727203985.15436: done checking for max_fail_percentage 25675 1727203985.15437: checking to see if all hosts have failed and the running result is not ok 25675 1727203985.15438: done checking to see if all hosts have failed 25675 1727203985.15439: getting the remaining hosts for this loop 25675 1727203985.15440: done getting the remaining hosts for this loop 25675 1727203985.15444: getting the next task for host managed-node2 25675 1727203985.15451: done getting next task for host managed-node2 25675 1727203985.15453: ^ task is: TASK: meta (flush_handlers) 25675 1727203985.15455: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203985.15460: getting variables 25675 1727203985.15462: in VariableManager get_vars() 25675 1727203985.15615: Calling all_inventory to load vars for managed-node2 25675 1727203985.15618: Calling groups_inventory to load vars for managed-node2 25675 1727203985.15622: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.15633: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.15636: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.15639: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.16250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.16459: done with get_vars() 25675 1727203985.16469: done getting variables 25675 1727203985.16549: in VariableManager get_vars() 25675 1727203985.16559: Calling all_inventory to load vars for managed-node2 25675 1727203985.16561: Calling groups_inventory to load vars for managed-node2 25675 1727203985.16563: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.16568: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.16570: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.16573: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.16727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.16923: done with get_vars() 25675 1727203985.16941: done queuing things up, now waiting for results queue to drain 25675 1727203985.16943: results queue empty 25675 1727203985.16943: checking for any_errors_fatal 25675 1727203985.16946: done checking for any_errors_fatal 25675 1727203985.16947: checking for max_fail_percentage 25675 1727203985.16948: done checking for max_fail_percentage 25675 1727203985.16948: checking to see if all hosts have failed and the running result is not ok 25675 1727203985.16949: done checking to see if all hosts have failed 25675 1727203985.16950: getting the remaining hosts for this loop 25675 1727203985.16951: done getting the remaining hosts for this loop 25675 1727203985.16953: getting the next task for host managed-node2 25675 1727203985.16967: done getting next task for host managed-node2 25675 1727203985.16969: ^ task is: TASK: meta (flush_handlers) 25675 1727203985.16970: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203985.16973: getting variables 25675 1727203985.16974: in VariableManager get_vars() 25675 1727203985.16984: Calling all_inventory to load vars for managed-node2 25675 1727203985.16986: Calling groups_inventory to load vars for managed-node2 25675 1727203985.16989: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.16993: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.16996: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.16998: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.17170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.17362: done with get_vars() 25675 1727203985.17377: done getting variables 25675 1727203985.17433: in VariableManager get_vars() 25675 1727203985.17442: Calling all_inventory to load vars for managed-node2 25675 1727203985.17444: Calling groups_inventory to load vars for managed-node2 25675 1727203985.17446: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.17451: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.17453: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.17456: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.17603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.17792: done with get_vars() 25675 1727203985.17809: done queuing things up, now waiting for results queue to drain 25675 1727203985.17811: results queue empty 25675 1727203985.17812: checking for any_errors_fatal 25675 1727203985.17813: done checking for any_errors_fatal 25675 1727203985.17813: checking for max_fail_percentage 25675 1727203985.17814: done checking for max_fail_percentage 25675 1727203985.17815: checking to see if all hosts have failed and the running result is not ok 25675 1727203985.17816: done checking to see if all hosts have failed 25675 1727203985.17816: getting the remaining hosts for this loop 25675 1727203985.17817: done getting the remaining hosts for this loop 25675 1727203985.17820: getting the next task for host managed-node2 25675 1727203985.17823: done getting next task for host managed-node2 25675 1727203985.17824: ^ task is: None 25675 1727203985.17826: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203985.17827: done queuing things up, now waiting for results queue to drain 25675 1727203985.17828: results queue empty 25675 1727203985.17828: checking for any_errors_fatal 25675 1727203985.17829: done checking for any_errors_fatal 25675 1727203985.17830: checking for max_fail_percentage 25675 1727203985.17831: done checking for max_fail_percentage 25675 1727203985.17831: checking to see if all hosts have failed and the running result is not ok 25675 1727203985.17832: done checking to see if all hosts have failed 25675 1727203985.17838: getting the next task for host managed-node2 25675 1727203985.17842: done getting next task for host managed-node2 25675 1727203985.17843: ^ task is: None 25675 1727203985.17844: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203985.17885: in VariableManager get_vars() 25675 1727203985.17901: done with get_vars() 25675 1727203985.17907: in VariableManager get_vars() 25675 1727203985.17923: done with get_vars() 25675 1727203985.17929: variable 'omit' from source: magic vars 25675 1727203985.17974: in VariableManager get_vars() 25675 1727203985.17986: done with get_vars() 25675 1727203985.18006: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 25675 1727203985.18199: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727203985.18357: getting the remaining hosts for this loop 25675 1727203985.18359: done getting the remaining hosts for this loop 25675 1727203985.18362: getting the next task for host managed-node2 25675 1727203985.18365: done getting next task for host managed-node2 25675 1727203985.18366: ^ task is: TASK: Gathering Facts 25675 1727203985.18368: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203985.18370: getting variables 25675 1727203985.18371: in VariableManager get_vars() 25675 1727203985.18381: Calling all_inventory to load vars for managed-node2 25675 1727203985.18384: Calling groups_inventory to load vars for managed-node2 25675 1727203985.18386: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203985.18391: Calling all_plugins_play to load vars for managed-node2 25675 1727203985.18394: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203985.18397: Calling groups_plugins_play to load vars for managed-node2 25675 1727203985.18697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203985.18897: done with get_vars() 25675 1727203985.18905: done getting variables 25675 1727203985.18948: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Tuesday 24 September 2024 14:53:05 -0400 (0:00:00.062) 0:00:04.640 ***** 25675 1727203985.18978: entering _queue_task() for managed-node2/gather_facts 25675 1727203985.19410: worker is 1 (out of 1 available) 25675 1727203985.19420: exiting _queue_task() for managed-node2/gather_facts 25675 1727203985.19429: done queuing things up, now waiting for results queue to drain 25675 1727203985.19430: waiting for pending results... 
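
At this point the new play "Test configuring ethernet devices" has started and its "Gathering Facts" task (tests_ethernet.yml:13) has been queued. As a rough sketch only (assumed, not quoted from the playbook; the host pattern in particular is a guess, since the log shows only managed-node2 being targeted), the play head that drives this step would look something like:

    # Assumed sketch of the play whose banner appears above
    - name: Test configuring ethernet devices
      hosts: all            # assumption; only managed-node2 appears in this run
      gather_facts: true    # drives the TASK [Gathering Facts] queued above
      # ...the actual ethernet test tasks from tests_ethernet.yml would follow here

The fact-gathering itself is visible in the log below: over the existing SSH control connection Ansible resolves the remote home directory, creates ~/.ansible/tmp/ansible-tmp-..., uploads AnsiballZ_setup.py via sftp, marks it executable, and runs it with /usr/bin/python3.12; the JSON the module prints on stdout becomes ansible_facts (CentOS Stream 10, interfaces lo and eth0, and so on).
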
25675 1727203985.19585: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727203985.19695: in run() - task 028d2410-947f-41bd-b19d-0000000000f0 25675 1727203985.19714: variable 'ansible_search_path' from source: unknown 25675 1727203985.19766: calling self._execute() 25675 1727203985.19877: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203985.19881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.19883: variable 'omit' from source: magic vars 25675 1727203985.20269: variable 'ansible_distribution_major_version' from source: facts 25675 1727203985.20291: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203985.20382: variable 'omit' from source: magic vars 25675 1727203985.20386: variable 'omit' from source: magic vars 25675 1727203985.20389: variable 'omit' from source: magic vars 25675 1727203985.20425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203985.20464: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203985.20494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203985.20515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203985.20540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203985.20572: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203985.20582: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203985.20590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.20707: Set connection var ansible_shell_type to sh 25675 1727203985.20720: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203985.20729: Set connection var ansible_timeout to 10 25675 1727203985.20745: Set connection var ansible_pipelining to False 25675 1727203985.20755: Set connection var ansible_shell_executable to /bin/sh 25675 1727203985.20815: Set connection var ansible_connection to ssh 25675 1727203985.20818: variable 'ansible_shell_executable' from source: unknown 25675 1727203985.20820: variable 'ansible_connection' from source: unknown 25675 1727203985.20822: variable 'ansible_module_compression' from source: unknown 25675 1727203985.20824: variable 'ansible_shell_type' from source: unknown 25675 1727203985.20826: variable 'ansible_shell_executable' from source: unknown 25675 1727203985.20828: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203985.20830: variable 'ansible_pipelining' from source: unknown 25675 1727203985.20832: variable 'ansible_timeout' from source: unknown 25675 1727203985.20834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203985.21032: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203985.21048: variable 'omit' from source: magic vars 25675 1727203985.21058: starting attempt loop 25675 1727203985.21073: running the 
handler 25675 1727203985.21095: variable 'ansible_facts' from source: unknown 25675 1727203985.21140: _low_level_execute_command(): starting 25675 1727203985.21143: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203985.21848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203985.21953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727203985.21957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.22007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203985.22051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203985.22121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203985.24516: stdout chunk (state=3): >>>/root <<< 25675 1727203985.24708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203985.24737: stderr chunk (state=3): >>><<< 25675 1727203985.24760: stdout chunk (state=3): >>><<< 25675 1727203985.24785: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25675 1727203985.24801: _low_level_execute_command(): starting 25675 1727203985.24823: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294 `" && echo 
ansible-tmp-1727203985.247847-25962-89994941802294="` echo /root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294 `" ) && sleep 0' 25675 1727203985.25284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203985.25287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.25290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203985.25303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.25340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203985.25344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203985.25473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203985.28242: stdout chunk (state=3): >>>ansible-tmp-1727203985.247847-25962-89994941802294=/root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294 <<< 25675 1727203985.28397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203985.28424: stderr chunk (state=3): >>><<< 25675 1727203985.28427: stdout chunk (state=3): >>><<< 25675 1727203985.28443: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203985.247847-25962-89994941802294=/root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25675 1727203985.28468: variable 'ansible_module_compression' from source: unknown 25675 1727203985.28518: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727203985.28566: variable 'ansible_facts' from source: unknown 25675 1727203985.28709: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/AnsiballZ_setup.py 25675 1727203985.28811: Sending initial data 25675 1727203985.28815: Sent initial data (152 bytes) 25675 1727203985.29289: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203985.29293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.29295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203985.29298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727203985.29300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.29347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203985.29350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203985.29352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203985.29434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203985.31734: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203985.31829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203985.31930: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpmuv9791a /root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/AnsiballZ_setup.py <<< 25675 1727203985.31933: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/AnsiballZ_setup.py" <<< 25675 1727203985.32022: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpmuv9791a" to remote "/root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/AnsiballZ_setup.py" <<< 25675 1727203985.33226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203985.33278: stderr chunk (state=3): >>><<< 25675 1727203985.33282: stdout chunk (state=3): >>><<< 25675 1727203985.33299: done transferring module to remote 25675 1727203985.33308: _low_level_execute_command(): starting 25675 1727203985.33313: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/ /root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/AnsiballZ_setup.py && sleep 0' 25675 1727203985.33773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203985.33779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.33781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203985.33783: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203985.33785: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.33835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203985.33839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203985.33926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203985.36530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203985.36558: stderr chunk (state=3): >>><<< 25675 1727203985.36561: stdout chunk (state=3): >>><<< 25675 1727203985.36579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25675 1727203985.36583: _low_level_execute_command(): starting 25675 1727203985.36585: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/AnsiballZ_setup.py && sleep 0' 25675 1727203985.37048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203985.37051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203985.37053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.37055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 25675 1727203985.37057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203985.37060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203985.37117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203985.37120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203985.37122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203985.37209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25675 1727203986.21652: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": 
"10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.45751953125, "5m": 0.4169921875, "15m": 0.22021484375}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "05", "epoch": "1727203985", "epoch_int": "1727203985", "date": "2024-09-24", "time": "14:53:05", "iso8601_micro": "2024-09-24T18:53:05.758384Z", "iso8601": "2024-09-24T18:53:05Z", "iso8601_basic": "20240924T145305758384", "iso8601_basic_short": "20240924T145305", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2931, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 600, "free": 2931}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, 
"used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 572, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785567232, "block_size": 4096, "block_total": 65519099, "block_available": 63912492, "block_used": 1606607, "inode_total": 131070960, "inode_available": 131027262, "inode_used": 43698, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727203986.23542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203986.23627: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203986.23671: stderr chunk (state=3): >>><<< 25675 1727203986.23784: stdout chunk (state=3): >>><<< 25675 1727203986.23789: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.45751953125, "5m": 0.4169921875, "15m": 0.22021484375}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "05", "epoch": "1727203985", "epoch_int": "1727203985", "date": "2024-09-24", "time": "14:53:05", "iso8601_micro": "2024-09-24T18:53:05.758384Z", "iso8601": "2024-09-24T18:53:05Z", "iso8601_basic": "20240924T145305758384", "iso8601_basic_short": "20240924T145305", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2931, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 600, "free": 2931}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 572, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785567232, "block_size": 4096, "block_total": 65519099, "block_available": 63912492, "block_used": 1606607, "inode_total": 131070960, "inode_available": 131027262, "inode_used": 43698, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], 
"gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727203986.24289: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203986.24293: _low_level_execute_command(): starting 25675 1727203986.24295: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203985.247847-25962-89994941802294/ > /dev/null 2>&1 && sleep 0' 25675 1727203986.26014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203986.26062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203986.26140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 
1727203986.26255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203986.28202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203986.28221: stdout chunk (state=3): >>><<< 25675 1727203986.28237: stderr chunk (state=3): >>><<< 25675 1727203986.28259: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203986.28382: handler run complete 25675 1727203986.28410: variable 'ansible_facts' from source: unknown 25675 1727203986.28523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.28881: variable 'ansible_facts' from source: unknown 25675 1727203986.28987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.29116: attempt loop complete, returning result 25675 1727203986.29119: _execute() done 25675 1727203986.29122: dumping result to json 25675 1727203986.29148: done dumping result, returning 25675 1727203986.29155: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-0000000000f0] 25675 1727203986.29159: sending task result for task 028d2410-947f-41bd-b19d-0000000000f0 ok: [managed-node2] 25675 1727203986.30145: no more pending results, returning what we have 25675 1727203986.30148: results queue empty 25675 1727203986.30149: checking for any_errors_fatal 25675 1727203986.30151: done checking for any_errors_fatal 25675 1727203986.30152: checking for max_fail_percentage 25675 1727203986.30153: done checking for max_fail_percentage 25675 1727203986.30154: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.30155: done checking to see if all hosts have failed 25675 1727203986.30156: getting the remaining hosts for this loop 25675 1727203986.30157: done getting the remaining hosts for this loop 25675 1727203986.30161: getting the next task for host managed-node2 25675 1727203986.30166: done getting next task for host managed-node2 25675 1727203986.30168: ^ task is: TASK: meta (flush_handlers) 25675 1727203986.30173: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203986.30181: getting variables 25675 1727203986.30183: in VariableManager get_vars() 25675 1727203986.30204: Calling all_inventory to load vars for managed-node2 25675 1727203986.30207: Calling groups_inventory to load vars for managed-node2 25675 1727203986.30210: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.30216: done sending task result for task 028d2410-947f-41bd-b19d-0000000000f0 25675 1727203986.30219: WORKER PROCESS EXITING 25675 1727203986.30228: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.30237: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.30241: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.30406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.30701: done with get_vars() 25675 1727203986.30711: done getting variables 25675 1727203986.30807: in VariableManager get_vars() 25675 1727203986.30817: Calling all_inventory to load vars for managed-node2 25675 1727203986.30823: Calling groups_inventory to load vars for managed-node2 25675 1727203986.30824: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.30829: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.30832: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.30835: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.30940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.31046: done with get_vars() 25675 1727203986.31055: done queuing things up, now waiting for results queue to drain 25675 1727203986.31056: results queue empty 25675 1727203986.31057: checking for any_errors_fatal 25675 1727203986.31058: done checking for any_errors_fatal 25675 1727203986.31059: checking for max_fail_percentage 25675 1727203986.31060: done checking for max_fail_percentage 25675 1727203986.31060: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.31061: done checking to see if all hosts have failed 25675 1727203986.31065: getting the remaining hosts for this loop 25675 1727203986.31066: done getting the remaining hosts for this loop 25675 1727203986.31067: getting the next task for host managed-node2 25675 1727203986.31070: done getting next task for host managed-node2 25675 1727203986.31072: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 25675 1727203986.31073: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203986.31074: getting variables 25675 1727203986.31075: in VariableManager get_vars() 25675 1727203986.31082: Calling all_inventory to load vars for managed-node2 25675 1727203986.31083: Calling groups_inventory to load vars for managed-node2 25675 1727203986.31085: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.31088: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.31089: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.31091: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.31168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.31272: done with get_vars() 25675 1727203986.31279: done getting variables 25675 1727203986.31306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727203986.31407: variable 'type' from source: play vars 25675 1727203986.31411: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Tuesday 24 September 2024 14:53:06 -0400 (0:00:01.124) 0:00:05.765 ***** 25675 1727203986.31441: entering _queue_task() for managed-node2/set_fact 25675 1727203986.31656: worker is 1 (out of 1 available) 25675 1727203986.31668: exiting _queue_task() for managed-node2/set_fact 25675 1727203986.31681: done queuing things up, now waiting for results queue to drain 25675 1727203986.31683: waiting for pending results... 
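(Editor's sketch: the trace above queues a set_fact action whose task name is rendered from the play vars type and interface. Judging from the task name at tests_ethernet.yml:20 and the ansible_facts returned a few entries further down, the task is presumably along these lines; this is a reconstruction from the log, not the verbatim playbook source:

  - name: Set type={{ type }} and interface={{ interface }}
    set_fact:
      type: "{{ type }}"            # renders as "veth" from play vars
      interface: "{{ interface }}"  # renders as "lsr27" from play vars
)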
25675 1727203986.31829: running TaskExecutor() for managed-node2/TASK: Set type=veth and interface=lsr27 25675 1727203986.31886: in run() - task 028d2410-947f-41bd-b19d-00000000000f 25675 1727203986.31897: variable 'ansible_search_path' from source: unknown 25675 1727203986.31928: calling self._execute() 25675 1727203986.31985: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.31988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.31999: variable 'omit' from source: magic vars 25675 1727203986.32252: variable 'ansible_distribution_major_version' from source: facts 25675 1727203986.32262: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203986.32268: variable 'omit' from source: magic vars 25675 1727203986.32295: variable 'omit' from source: magic vars 25675 1727203986.32316: variable 'type' from source: play vars 25675 1727203986.32370: variable 'type' from source: play vars 25675 1727203986.32382: variable 'interface' from source: play vars 25675 1727203986.32425: variable 'interface' from source: play vars 25675 1727203986.32437: variable 'omit' from source: magic vars 25675 1727203986.32469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203986.32499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203986.32537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203986.32715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203986.32718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203986.32721: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203986.32723: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.32725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.32980: Set connection var ansible_shell_type to sh 25675 1727203986.32984: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203986.32986: Set connection var ansible_timeout to 10 25675 1727203986.32988: Set connection var ansible_pipelining to False 25675 1727203986.32990: Set connection var ansible_shell_executable to /bin/sh 25675 1727203986.32992: Set connection var ansible_connection to ssh 25675 1727203986.32994: variable 'ansible_shell_executable' from source: unknown 25675 1727203986.32997: variable 'ansible_connection' from source: unknown 25675 1727203986.32999: variable 'ansible_module_compression' from source: unknown 25675 1727203986.33001: variable 'ansible_shell_type' from source: unknown 25675 1727203986.33002: variable 'ansible_shell_executable' from source: unknown 25675 1727203986.33004: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.33006: variable 'ansible_pipelining' from source: unknown 25675 1727203986.33008: variable 'ansible_timeout' from source: unknown 25675 1727203986.33010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.33041: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203986.33054: variable 'omit' from source: magic vars 25675 1727203986.33065: starting attempt loop 25675 1727203986.33078: running the handler 25675 1727203986.33094: handler run complete 25675 1727203986.33107: attempt loop complete, returning result 25675 1727203986.33113: _execute() done 25675 1727203986.33123: dumping result to json 25675 1727203986.33130: done dumping result, returning 25675 1727203986.33233: done running TaskExecutor() for managed-node2/TASK: Set type=veth and interface=lsr27 [028d2410-947f-41bd-b19d-00000000000f] 25675 1727203986.33237: sending task result for task 028d2410-947f-41bd-b19d-00000000000f 25675 1727203986.33308: done sending task result for task 028d2410-947f-41bd-b19d-00000000000f 25675 1727203986.33312: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 25675 1727203986.33396: no more pending results, returning what we have 25675 1727203986.33399: results queue empty 25675 1727203986.33400: checking for any_errors_fatal 25675 1727203986.33401: done checking for any_errors_fatal 25675 1727203986.33402: checking for max_fail_percentage 25675 1727203986.33404: done checking for max_fail_percentage 25675 1727203986.33405: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.33405: done checking to see if all hosts have failed 25675 1727203986.33406: getting the remaining hosts for this loop 25675 1727203986.33407: done getting the remaining hosts for this loop 25675 1727203986.33411: getting the next task for host managed-node2 25675 1727203986.33416: done getting next task for host managed-node2 25675 1727203986.33419: ^ task is: TASK: Include the task 'show_interfaces.yml' 25675 1727203986.33420: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203986.33424: getting variables 25675 1727203986.33427: in VariableManager get_vars() 25675 1727203986.33455: Calling all_inventory to load vars for managed-node2 25675 1727203986.33459: Calling groups_inventory to load vars for managed-node2 25675 1727203986.33463: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.33480: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.33484: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.33487: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.33762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.33874: done with get_vars() 25675 1727203986.33882: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.024) 0:00:05.790 ***** 25675 1727203986.33938: entering _queue_task() for managed-node2/include_tasks 25675 1727203986.34128: worker is 1 (out of 1 available) 25675 1727203986.34139: exiting _queue_task() for managed-node2/include_tasks 25675 1727203986.34151: done queuing things up, now waiting for results queue to drain 25675 1727203986.34152: waiting for pending results... 25675 1727203986.34300: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 25675 1727203986.34357: in run() - task 028d2410-947f-41bd-b19d-000000000010 25675 1727203986.34368: variable 'ansible_search_path' from source: unknown 25675 1727203986.34399: calling self._execute() 25675 1727203986.34456: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.34460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.34472: variable 'omit' from source: magic vars 25675 1727203986.34728: variable 'ansible_distribution_major_version' from source: facts 25675 1727203986.34738: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203986.34745: _execute() done 25675 1727203986.34748: dumping result to json 25675 1727203986.34750: done dumping result, returning 25675 1727203986.34755: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-41bd-b19d-000000000010] 25675 1727203986.34761: sending task result for task 028d2410-947f-41bd-b19d-000000000010 25675 1727203986.34842: done sending task result for task 028d2410-947f-41bd-b19d-000000000010 25675 1727203986.34845: WORKER PROCESS EXITING 25675 1727203986.34874: no more pending results, returning what we have 25675 1727203986.34880: in VariableManager get_vars() 25675 1727203986.34908: Calling all_inventory to load vars for managed-node2 25675 1727203986.34911: Calling groups_inventory to load vars for managed-node2 25675 1727203986.34914: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.34922: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.34924: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.34926: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.35047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.35156: done with get_vars() 25675 1727203986.35161: 
variable 'ansible_search_path' from source: unknown 25675 1727203986.35172: we have included files to process 25675 1727203986.35173: generating all_blocks data 25675 1727203986.35173: done generating all_blocks data 25675 1727203986.35174: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25675 1727203986.35176: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25675 1727203986.35178: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25675 1727203986.35281: in VariableManager get_vars() 25675 1727203986.35291: done with get_vars() 25675 1727203986.35360: done processing included file 25675 1727203986.35361: iterating over new_blocks loaded from include file 25675 1727203986.35362: in VariableManager get_vars() 25675 1727203986.35371: done with get_vars() 25675 1727203986.35372: filtering new block on tags 25675 1727203986.35384: done filtering new block on tags 25675 1727203986.35549: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 25675 1727203986.35552: extending task lists for all hosts with included blocks 25675 1727203986.35602: done extending task lists 25675 1727203986.35603: done processing included files 25675 1727203986.35604: results queue empty 25675 1727203986.35604: checking for any_errors_fatal 25675 1727203986.35605: done checking for any_errors_fatal 25675 1727203986.35606: checking for max_fail_percentage 25675 1727203986.35606: done checking for max_fail_percentage 25675 1727203986.35607: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.35607: done checking to see if all hosts have failed 25675 1727203986.35608: getting the remaining hosts for this loop 25675 1727203986.35609: done getting the remaining hosts for this loop 25675 1727203986.35610: getting the next task for host managed-node2 25675 1727203986.35612: done getting next task for host managed-node2 25675 1727203986.35613: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25675 1727203986.35615: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203986.35616: getting variables 25675 1727203986.35617: in VariableManager get_vars() 25675 1727203986.35622: Calling all_inventory to load vars for managed-node2 25675 1727203986.35623: Calling groups_inventory to load vars for managed-node2 25675 1727203986.35625: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.35628: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.35631: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.35633: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.35712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.35819: done with get_vars() 25675 1727203986.35825: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.019) 0:00:05.809 ***** 25675 1727203986.35874: entering _queue_task() for managed-node2/include_tasks 25675 1727203986.36056: worker is 1 (out of 1 available) 25675 1727203986.36068: exiting _queue_task() for managed-node2/include_tasks 25675 1727203986.36085: done queuing things up, now waiting for results queue to drain 25675 1727203986.36086: waiting for pending results... 25675 1727203986.36391: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 25675 1727203986.36400: in run() - task 028d2410-947f-41bd-b19d-000000000104 25675 1727203986.36402: variable 'ansible_search_path' from source: unknown 25675 1727203986.36404: variable 'ansible_search_path' from source: unknown 25675 1727203986.36407: calling self._execute() 25675 1727203986.36447: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.36457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.36468: variable 'omit' from source: magic vars 25675 1727203986.36772: variable 'ansible_distribution_major_version' from source: facts 25675 1727203986.36788: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203986.36797: _execute() done 25675 1727203986.36803: dumping result to json 25675 1727203986.36808: done dumping result, returning 25675 1727203986.36816: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-41bd-b19d-000000000104] 25675 1727203986.36823: sending task result for task 028d2410-947f-41bd-b19d-000000000104 25675 1727203986.36956: no more pending results, returning what we have 25675 1727203986.36961: in VariableManager get_vars() 25675 1727203986.36995: Calling all_inventory to load vars for managed-node2 25675 1727203986.36997: Calling groups_inventory to load vars for managed-node2 25675 1727203986.37001: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.37012: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.37017: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.37020: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.37167: done sending task result for task 028d2410-947f-41bd-b19d-000000000104 25675 1727203986.37172: WORKER PROCESS EXITING 25675 1727203986.37192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 25675 1727203986.37407: done with get_vars() 25675 1727203986.37413: variable 'ansible_search_path' from source: unknown 25675 1727203986.37414: variable 'ansible_search_path' from source: unknown 25675 1727203986.37451: we have included files to process 25675 1727203986.37452: generating all_blocks data 25675 1727203986.37454: done generating all_blocks data 25675 1727203986.37455: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25675 1727203986.37456: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25675 1727203986.37458: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25675 1727203986.37731: done processing included file 25675 1727203986.37733: iterating over new_blocks loaded from include file 25675 1727203986.37734: in VariableManager get_vars() 25675 1727203986.37744: done with get_vars() 25675 1727203986.37745: filtering new block on tags 25675 1727203986.37765: done filtering new block on tags 25675 1727203986.37767: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 25675 1727203986.37773: extending task lists for all hosts with included blocks 25675 1727203986.37851: done extending task lists 25675 1727203986.37852: done processing included files 25675 1727203986.37853: results queue empty 25675 1727203986.37853: checking for any_errors_fatal 25675 1727203986.37855: done checking for any_errors_fatal 25675 1727203986.37856: checking for max_fail_percentage 25675 1727203986.37857: done checking for max_fail_percentage 25675 1727203986.37857: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.37858: done checking to see if all hosts have failed 25675 1727203986.37859: getting the remaining hosts for this loop 25675 1727203986.37860: done getting the remaining hosts for this loop 25675 1727203986.37868: getting the next task for host managed-node2 25675 1727203986.37874: done getting next task for host managed-node2 25675 1727203986.37880: ^ task is: TASK: Gather current interface info 25675 1727203986.37882: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203986.37884: getting variables 25675 1727203986.37885: in VariableManager get_vars() 25675 1727203986.37892: Calling all_inventory to load vars for managed-node2 25675 1727203986.37894: Calling groups_inventory to load vars for managed-node2 25675 1727203986.37896: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.37900: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.37902: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.37904: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.38030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.38212: done with get_vars() 25675 1727203986.38219: done getting variables 25675 1727203986.38250: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.023) 0:00:05.833 ***** 25675 1727203986.38277: entering _queue_task() for managed-node2/command 25675 1727203986.38482: worker is 1 (out of 1 available) 25675 1727203986.38494: exiting _queue_task() for managed-node2/command 25675 1727203986.38505: done queuing things up, now waiting for results queue to drain 25675 1727203986.38506: waiting for pending results... 
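(Editor's sketch: the entry above queues the command action for "Gather current interface info" at get_current_interfaces.yml:3. The module_args echoed in its result further down (chdir=/sys/class/net, _raw_params="ls -1") suggest the task looks roughly like the following; the register name is an assumption, since it is not visible in this log:

  - name: Gather current interface info
    command: ls -1
    args:
      chdir: /sys/class/net           # list the kernel's network device entries
    register: current_interface_info  # register name assumed, not shown in the log
)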
25675 1727203986.38636: running TaskExecutor() for managed-node2/TASK: Gather current interface info 25675 1727203986.38708: in run() - task 028d2410-947f-41bd-b19d-000000000115 25675 1727203986.38718: variable 'ansible_search_path' from source: unknown 25675 1727203986.38721: variable 'ansible_search_path' from source: unknown 25675 1727203986.38753: calling self._execute() 25675 1727203986.38811: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.38815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.38823: variable 'omit' from source: magic vars 25675 1727203986.39129: variable 'ansible_distribution_major_version' from source: facts 25675 1727203986.39138: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203986.39143: variable 'omit' from source: magic vars 25675 1727203986.39178: variable 'omit' from source: magic vars 25675 1727203986.39202: variable 'omit' from source: magic vars 25675 1727203986.39233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203986.39258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203986.39285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203986.39295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203986.39305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203986.39327: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203986.39331: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.39333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.39403: Set connection var ansible_shell_type to sh 25675 1727203986.39407: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203986.39413: Set connection var ansible_timeout to 10 25675 1727203986.39418: Set connection var ansible_pipelining to False 25675 1727203986.39422: Set connection var ansible_shell_executable to /bin/sh 25675 1727203986.39425: Set connection var ansible_connection to ssh 25675 1727203986.39444: variable 'ansible_shell_executable' from source: unknown 25675 1727203986.39447: variable 'ansible_connection' from source: unknown 25675 1727203986.39449: variable 'ansible_module_compression' from source: unknown 25675 1727203986.39452: variable 'ansible_shell_type' from source: unknown 25675 1727203986.39454: variable 'ansible_shell_executable' from source: unknown 25675 1727203986.39456: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.39459: variable 'ansible_pipelining' from source: unknown 25675 1727203986.39463: variable 'ansible_timeout' from source: unknown 25675 1727203986.39468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.39565: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203986.39577: variable 'omit' from source: magic vars 25675 
1727203986.39583: starting attempt loop 25675 1727203986.39586: running the handler 25675 1727203986.39598: _low_level_execute_command(): starting 25675 1727203986.39610: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203986.40112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203986.40115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203986.40118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203986.40120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203986.40180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203986.40187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203986.40189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203986.40268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203986.41962: stdout chunk (state=3): >>>/root <<< 25675 1727203986.42057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203986.42093: stderr chunk (state=3): >>><<< 25675 1727203986.42097: stdout chunk (state=3): >>><<< 25675 1727203986.42116: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203986.42127: _low_level_execute_command(): starting 25675 1727203986.42133: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877 `" && echo ansible-tmp-1727203986.4211578-26067-253076830931877="` echo /root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877 `" ) && sleep 0' 25675 1727203986.42575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203986.42580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203986.42582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203986.42591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203986.42593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203986.42641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203986.42644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203986.42713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203986.44650: stdout chunk (state=3): >>>ansible-tmp-1727203986.4211578-26067-253076830931877=/root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877 <<< 25675 1727203986.44753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203986.44782: stderr chunk (state=3): >>><<< 25675 1727203986.44785: stdout chunk (state=3): >>><<< 25675 1727203986.44800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203986.4211578-26067-253076830931877=/root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 25675 1727203986.44823: variable 'ansible_module_compression' from source: unknown 25675 1727203986.44862: ANSIBALLZ: Using generic lock for ansible.legacy.command 25675 1727203986.44865: ANSIBALLZ: Acquiring lock 25675 1727203986.44867: ANSIBALLZ: Lock acquired: 139822507557424 25675 1727203986.44872: ANSIBALLZ: Creating module 25675 1727203986.56011: ANSIBALLZ: Writing module into payload 25675 1727203986.56014: ANSIBALLZ: Writing module 25675 1727203986.56017: ANSIBALLZ: Renaming module 25675 1727203986.56019: ANSIBALLZ: Done creating module 25675 1727203986.56021: variable 'ansible_facts' from source: unknown 25675 1727203986.56105: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/AnsiballZ_command.py 25675 1727203986.56384: Sending initial data 25675 1727203986.56387: Sent initial data (156 bytes) 25675 1727203986.57248: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203986.57440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203986.57622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203986.57938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203986.59631: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203986.59814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203986.59894: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpe8ygaggs /root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/AnsiballZ_command.py <<< 25675 1727203986.59913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/AnsiballZ_command.py" <<< 25675 1727203986.60051: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpe8ygaggs" to remote "/root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/AnsiballZ_command.py" <<< 25675 1727203986.60991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203986.61066: stderr chunk (state=3): >>><<< 25675 1727203986.61086: stdout chunk (state=3): >>><<< 25675 1727203986.61153: done transferring module to remote 25675 1727203986.61177: _low_level_execute_command(): starting 25675 1727203986.61189: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/ /root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/AnsiballZ_command.py && sleep 0' 25675 1727203986.61893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203986.61938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203986.61955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203986.61981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203986.62129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203986.63991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203986.64049: stderr chunk (state=3): >>><<< 25675 1727203986.64053: stdout chunk (state=3): >>><<< 25675 1727203986.64071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203986.64152: _low_level_execute_command(): starting 25675 1727203986.64155: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/AnsiballZ_command.py && sleep 0' 25675 1727203986.64691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203986.64701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203986.64713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203986.64791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203986.64820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203986.64833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203986.64852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203986.65091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203986.81100: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:06.805383", "end": "2024-09-24 14:53:06.808793", "delta": "0:00:00.003410", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727203986.82582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203986.82597: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203986.82634: stderr chunk (state=3): >>><<< 25675 1727203986.82636: stdout chunk (state=3): >>><<< 25675 1727203986.82647: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:06.805383", "end": "2024-09-24 14:53:06.808793", "delta": "0:00:00.003410", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
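The module_args JSON in the result above pins down the task that was just executed: a command module run with chdir=/sys/class/net and raw params "ls -1". A minimal YAML reconstruction is sketched below; it is inferred from the log, not quoted from the repository. The register name is an assumption based on the '_current_interfaces' variable read later in this log, and changed_when is an assumption based on the 'ok:' result printed a little further on reporting "changed": false even though the module itself returned "changed": true.

# Sketch of the task behind the module invocation above (inferred, not verbatim)
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces   # assumed name; a variable of this name is read later in the log
  changed_when: false             # assumed; the callback reports changed=false despite the module returning changed=true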
25675 1727203986.82714: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203986.82718: _low_level_execute_command(): starting 25675 1727203986.82720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203986.4211578-26067-253076830931877/ > /dev/null 2>&1 && sleep 0' 25675 1727203986.83122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203986.83125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203986.83127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727203986.83129: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203986.83136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203986.83186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203986.83197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203986.83264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203986.85282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203986.85286: stdout chunk (state=3): >>><<< 25675 1727203986.85288: stderr chunk (state=3): >>><<< 25675 1727203986.85294: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203986.85313: handler run complete 25675 1727203986.85323: Evaluated conditional (False): False 25675 1727203986.85390: attempt loop complete, returning result 25675 1727203986.85394: _execute() done 25675 1727203986.85396: dumping result to json 25675 1727203986.85398: done dumping result, returning 25675 1727203986.85400: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [028d2410-947f-41bd-b19d-000000000115] 25675 1727203986.85402: sending task result for task 028d2410-947f-41bd-b19d-000000000115 25675 1727203986.85727: done sending task result for task 028d2410-947f-41bd-b19d-000000000115 25675 1727203986.85730: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003410", "end": "2024-09-24 14:53:06.808793", "rc": 0, "start": "2024-09-24 14:53:06.805383" } STDOUT: bonding_masters eth0 lo 25675 1727203986.85812: no more pending results, returning what we have 25675 1727203986.85815: results queue empty 25675 1727203986.85816: checking for any_errors_fatal 25675 1727203986.85817: done checking for any_errors_fatal 25675 1727203986.85817: checking for max_fail_percentage 25675 1727203986.85819: done checking for max_fail_percentage 25675 1727203986.85819: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.85820: done checking to see if all hosts have failed 25675 1727203986.85821: getting the remaining hosts for this loop 25675 1727203986.85822: done getting the remaining hosts for this loop 25675 1727203986.85825: getting the next task for host managed-node2 25675 1727203986.85830: done getting next task for host managed-node2 25675 1727203986.85832: ^ task is: TASK: Set current_interfaces 25675 1727203986.85835: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203986.85838: getting variables 25675 1727203986.85910: in VariableManager get_vars() 25675 1727203986.85963: Calling all_inventory to load vars for managed-node2 25675 1727203986.85967: Calling groups_inventory to load vars for managed-node2 25675 1727203986.85972: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.85984: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.85986: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.85989: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.86245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.86504: done with get_vars() 25675 1727203986.86515: done getting variables 25675 1727203986.86574: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.483) 0:00:06.317 ***** 25675 1727203986.86613: entering _queue_task() for managed-node2/set_fact 25675 1727203986.87029: worker is 1 (out of 1 available) 25675 1727203986.87041: exiting _queue_task() for managed-node2/set_fact 25675 1727203986.87054: done queuing things up, now waiting for results queue to drain 25675 1727203986.87055: waiting for pending results... 
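The 'Set current_interfaces' task queued here (get_current_interfaces.yml:9) turns the registered command output into a fact. Since the fact it produces below is exactly the stdout lines of the earlier `ls -1`, a plausible sketch of the task, with the stdout_lines lookup being an inference rather than a quote, is:

# Plausible shape of the set_fact task (inferred from the fact it produces)
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed source expression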
25675 1727203986.87397: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 25675 1727203986.87401: in run() - task 028d2410-947f-41bd-b19d-000000000116 25675 1727203986.87404: variable 'ansible_search_path' from source: unknown 25675 1727203986.87406: variable 'ansible_search_path' from source: unknown 25675 1727203986.87433: calling self._execute() 25675 1727203986.87546: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.87558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.87572: variable 'omit' from source: magic vars 25675 1727203986.88260: variable 'ansible_distribution_major_version' from source: facts 25675 1727203986.88287: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203986.88298: variable 'omit' from source: magic vars 25675 1727203986.88481: variable 'omit' from source: magic vars 25675 1727203986.88578: variable '_current_interfaces' from source: set_fact 25675 1727203986.88807: variable 'omit' from source: magic vars 25675 1727203986.88882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203986.88963: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203986.89037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203986.89061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203986.89087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203986.89159: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203986.89168: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.89184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.89323: Set connection var ansible_shell_type to sh 25675 1727203986.89364: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203986.89380: Set connection var ansible_timeout to 10 25675 1727203986.89390: Set connection var ansible_pipelining to False 25675 1727203986.89399: Set connection var ansible_shell_executable to /bin/sh 25675 1727203986.89405: Set connection var ansible_connection to ssh 25675 1727203986.89485: variable 'ansible_shell_executable' from source: unknown 25675 1727203986.89488: variable 'ansible_connection' from source: unknown 25675 1727203986.89490: variable 'ansible_module_compression' from source: unknown 25675 1727203986.89492: variable 'ansible_shell_type' from source: unknown 25675 1727203986.89493: variable 'ansible_shell_executable' from source: unknown 25675 1727203986.89576: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.89580: variable 'ansible_pipelining' from source: unknown 25675 1727203986.89582: variable 'ansible_timeout' from source: unknown 25675 1727203986.89584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.89788: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 25675 1727203986.89791: variable 'omit' from source: magic vars 25675 1727203986.89793: starting attempt loop 25675 1727203986.89795: running the handler 25675 1727203986.89796: handler run complete 25675 1727203986.89798: attempt loop complete, returning result 25675 1727203986.89800: _execute() done 25675 1727203986.89801: dumping result to json 25675 1727203986.89803: done dumping result, returning 25675 1727203986.89811: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [028d2410-947f-41bd-b19d-000000000116] 25675 1727203986.89818: sending task result for task 028d2410-947f-41bd-b19d-000000000116 ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 25675 1727203986.90081: no more pending results, returning what we have 25675 1727203986.90086: results queue empty 25675 1727203986.90087: checking for any_errors_fatal 25675 1727203986.90096: done checking for any_errors_fatal 25675 1727203986.90097: checking for max_fail_percentage 25675 1727203986.90099: done checking for max_fail_percentage 25675 1727203986.90099: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.90100: done checking to see if all hosts have failed 25675 1727203986.90101: getting the remaining hosts for this loop 25675 1727203986.90102: done getting the remaining hosts for this loop 25675 1727203986.90108: getting the next task for host managed-node2 25675 1727203986.90116: done getting next task for host managed-node2 25675 1727203986.90120: ^ task is: TASK: Show current_interfaces 25675 1727203986.90123: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203986.90128: getting variables 25675 1727203986.90130: in VariableManager get_vars() 25675 1727203986.90161: Calling all_inventory to load vars for managed-node2 25675 1727203986.90164: Calling groups_inventory to load vars for managed-node2 25675 1727203986.90168: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.90298: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.90302: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.90307: done sending task result for task 028d2410-947f-41bd-b19d-000000000116 25675 1727203986.90309: WORKER PROCESS EXITING 25675 1727203986.90313: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.90591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.90827: done with get_vars() 25675 1727203986.90842: done getting variables 25675 1727203986.90899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.043) 0:00:06.360 ***** 25675 1727203986.90927: entering _queue_task() for managed-node2/debug 25675 1727203986.91207: worker is 1 (out of 1 available) 25675 1727203986.91222: exiting _queue_task() for managed-node2/debug 25675 1727203986.91234: done queuing things up, now waiting for results queue to drain 25675 1727203986.91235: waiting for pending results... 
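'Show current_interfaces' (show_interfaces.yml:5) is a plain debug task; judging from the MSG it emits below ("current_interfaces: ['bonding_masters', 'eth0', 'lo']"), it likely looks roughly like this, with the exact message template being a guess:

# Rough sketch of the debug task (message template inferred from its output)
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"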
25675 1727203986.91540: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 25675 1727203986.91681: in run() - task 028d2410-947f-41bd-b19d-000000000105 25675 1727203986.91701: variable 'ansible_search_path' from source: unknown 25675 1727203986.91709: variable 'ansible_search_path' from source: unknown 25675 1727203986.91751: calling self._execute() 25675 1727203986.91896: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.91900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.91902: variable 'omit' from source: magic vars 25675 1727203986.92230: variable 'ansible_distribution_major_version' from source: facts 25675 1727203986.92248: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203986.92259: variable 'omit' from source: magic vars 25675 1727203986.92303: variable 'omit' from source: magic vars 25675 1727203986.92405: variable 'current_interfaces' from source: set_fact 25675 1727203986.92439: variable 'omit' from source: magic vars 25675 1727203986.92485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203986.92524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203986.92549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203986.92573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203986.92680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203986.92683: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203986.92686: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.92688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.92735: Set connection var ansible_shell_type to sh 25675 1727203986.92746: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203986.92755: Set connection var ansible_timeout to 10 25675 1727203986.92763: Set connection var ansible_pipelining to False 25675 1727203986.92774: Set connection var ansible_shell_executable to /bin/sh 25675 1727203986.92783: Set connection var ansible_connection to ssh 25675 1727203986.92811: variable 'ansible_shell_executable' from source: unknown 25675 1727203986.92819: variable 'ansible_connection' from source: unknown 25675 1727203986.92825: variable 'ansible_module_compression' from source: unknown 25675 1727203986.92831: variable 'ansible_shell_type' from source: unknown 25675 1727203986.92837: variable 'ansible_shell_executable' from source: unknown 25675 1727203986.92843: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.92849: variable 'ansible_pipelining' from source: unknown 25675 1727203986.92855: variable 'ansible_timeout' from source: unknown 25675 1727203986.92862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.93000: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
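The 'Set connection var' lines above show the effective connection settings for this task: shell type sh, ZIP_DEFLATED module compression, a 10 second timeout, pipelining off, /bin/sh as the shell executable, and the ssh connection plugin. Of these, only ansible_host and ansible_ssh_extra_args are reported as coming from host vars for managed-node2; the log sources the others as 'unknown', i.e. resolved from defaults rather than inventory. A hypothetical host_vars/managed-node2.yml consistent with this section would contain little more than:

# Hypothetical host_vars fragment; the address is the one seen in the ssh debug
# output, and the extra-args value is not visible in this section, so it stays elided.
ansible_host: 10.31.13.254
ansible_ssh_extra_args: "..."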
25675 1727203986.93016: variable 'omit' from source: magic vars 25675 1727203986.93025: starting attempt loop 25675 1727203986.93031: running the handler 25675 1727203986.93081: handler run complete 25675 1727203986.93180: attempt loop complete, returning result 25675 1727203986.93184: _execute() done 25675 1727203986.93186: dumping result to json 25675 1727203986.93188: done dumping result, returning 25675 1727203986.93191: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [028d2410-947f-41bd-b19d-000000000105] 25675 1727203986.93193: sending task result for task 028d2410-947f-41bd-b19d-000000000105 25675 1727203986.93255: done sending task result for task 028d2410-947f-41bd-b19d-000000000105 25675 1727203986.93259: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 25675 1727203986.93306: no more pending results, returning what we have 25675 1727203986.93309: results queue empty 25675 1727203986.93310: checking for any_errors_fatal 25675 1727203986.93316: done checking for any_errors_fatal 25675 1727203986.93316: checking for max_fail_percentage 25675 1727203986.93318: done checking for max_fail_percentage 25675 1727203986.93318: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.93319: done checking to see if all hosts have failed 25675 1727203986.93320: getting the remaining hosts for this loop 25675 1727203986.93325: done getting the remaining hosts for this loop 25675 1727203986.93329: getting the next task for host managed-node2 25675 1727203986.93336: done getting next task for host managed-node2 25675 1727203986.93340: ^ task is: TASK: Include the task 'manage_test_interface.yml' 25675 1727203986.93341: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203986.93345: getting variables 25675 1727203986.93346: in VariableManager get_vars() 25675 1727203986.93373: Calling all_inventory to load vars for managed-node2 25675 1727203986.93378: Calling groups_inventory to load vars for managed-node2 25675 1727203986.93381: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.93391: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.93393: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.93396: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.93672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.93875: done with get_vars() 25675 1727203986.93902: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.031) 0:00:06.392 ***** 25675 1727203986.94100: entering _queue_task() for managed-node2/include_tasks 25675 1727203986.94365: worker is 1 (out of 1 available) 25675 1727203986.94380: exiting _queue_task() for managed-node2/include_tasks 25675 1727203986.94393: done queuing things up, now waiting for results queue to drain 25675 1727203986.94394: waiting for pending results... 
25675 1727203986.94693: running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' 25675 1727203986.94726: in run() - task 028d2410-947f-41bd-b19d-000000000011 25675 1727203986.94744: variable 'ansible_search_path' from source: unknown 25675 1727203986.94788: calling self._execute() 25675 1727203986.94881: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.94885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.94893: variable 'omit' from source: magic vars 25675 1727203986.95328: variable 'ansible_distribution_major_version' from source: facts 25675 1727203986.95336: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203986.95383: _execute() done 25675 1727203986.95387: dumping result to json 25675 1727203986.95392: done dumping result, returning 25675 1727203986.95395: done running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' [028d2410-947f-41bd-b19d-000000000011] 25675 1727203986.95397: sending task result for task 028d2410-947f-41bd-b19d-000000000011 25675 1727203986.95543: done sending task result for task 028d2410-947f-41bd-b19d-000000000011 25675 1727203986.95546: WORKER PROCESS EXITING 25675 1727203986.95574: no more pending results, returning what we have 25675 1727203986.95581: in VariableManager get_vars() 25675 1727203986.95616: Calling all_inventory to load vars for managed-node2 25675 1727203986.95620: Calling groups_inventory to load vars for managed-node2 25675 1727203986.95624: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.95636: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.95639: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.95642: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.96050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.96208: done with get_vars() 25675 1727203986.96214: variable 'ansible_search_path' from source: unknown 25675 1727203986.96225: we have included files to process 25675 1727203986.96226: generating all_blocks data 25675 1727203986.96227: done generating all_blocks data 25675 1727203986.96231: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25675 1727203986.96232: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25675 1727203986.96234: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25675 1727203986.96747: in VariableManager get_vars() 25675 1727203986.96763: done with get_vars() 25675 1727203986.96986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 25675 1727203986.97572: done processing included file 25675 1727203986.97574: iterating over new_blocks loaded from include file 25675 1727203986.97577: in VariableManager get_vars() 25675 1727203986.97589: done with get_vars() 25675 1727203986.97591: filtering new block on tags 25675 1727203986.97621: done filtering new block on tags 25675 1727203986.97624: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 25675 1727203986.97629: extending task lists for all hosts with included blocks 25675 1727203986.97799: done extending task lists 25675 1727203986.97800: done processing included files 25675 1727203986.97801: results queue empty 25675 1727203986.97802: checking for any_errors_fatal 25675 1727203986.97804: done checking for any_errors_fatal 25675 1727203986.97805: checking for max_fail_percentage 25675 1727203986.97806: done checking for max_fail_percentage 25675 1727203986.97807: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.97808: done checking to see if all hosts have failed 25675 1727203986.97808: getting the remaining hosts for this loop 25675 1727203986.97809: done getting the remaining hosts for this loop 25675 1727203986.97812: getting the next task for host managed-node2 25675 1727203986.97815: done getting next task for host managed-node2 25675 1727203986.97817: ^ task is: TASK: Ensure state in ["present", "absent"] 25675 1727203986.97820: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203986.97822: getting variables 25675 1727203986.97823: in VariableManager get_vars() 25675 1727203986.97831: Calling all_inventory to load vars for managed-node2 25675 1727203986.97833: Calling groups_inventory to load vars for managed-node2 25675 1727203986.97835: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.97840: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.97842: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203986.97845: Calling groups_plugins_play to load vars for managed-node2 25675 1727203986.97986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203986.98166: done with get_vars() 25675 1727203986.98179: done getting variables 25675 1727203986.98241: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.041) 0:00:06.433 ***** 25675 1727203986.98271: entering _queue_task() for managed-node2/fail 25675 1727203986.98273: Creating lock for fail 25675 1727203986.98551: worker is 1 (out of 1 available) 25675 1727203986.98561: exiting _queue_task() for managed-node2/fail 25675 1727203986.98573: done queuing things up, now waiting for results queue to drain 25675 1727203986.98575: waiting for pending results... 
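manage_test_interface.yml opens with two input-validation guards. The conditionals quoted in the two skip results that follow ("state not in [\"present\", \"absent\"]" and "type not in [\"dummy\", \"tap\", \"veth\"]") make their shape clear; the failure messages below are placeholders added for illustration, not the file's actual text.

# Sketch of the two fail guards (conditions quoted from the log; messages are placeholders)
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Invalid state: must be present or absent"   # placeholder message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Invalid type: must be dummy, tap or veth"   # placeholder message
  when: type not in ["dummy", "tap", "veth"]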
25675 1727203986.98896: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 25675 1727203986.98981: in run() - task 028d2410-947f-41bd-b19d-000000000131 25675 1727203986.98985: variable 'ansible_search_path' from source: unknown 25675 1727203986.98987: variable 'ansible_search_path' from source: unknown 25675 1727203986.98989: calling self._execute() 25675 1727203986.99104: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203986.99107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203986.99109: variable 'omit' from source: magic vars 25675 1727203986.99466: variable 'ansible_distribution_major_version' from source: facts 25675 1727203986.99487: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203986.99625: variable 'state' from source: include params 25675 1727203986.99636: Evaluated conditional (state not in ["present", "absent"]): False 25675 1727203986.99648: when evaluation is False, skipping this task 25675 1727203986.99656: _execute() done 25675 1727203986.99663: dumping result to json 25675 1727203986.99757: done dumping result, returning 25675 1727203986.99760: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [028d2410-947f-41bd-b19d-000000000131] 25675 1727203986.99763: sending task result for task 028d2410-947f-41bd-b19d-000000000131 25675 1727203986.99828: done sending task result for task 028d2410-947f-41bd-b19d-000000000131 25675 1727203986.99832: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 25675 1727203986.99910: no more pending results, returning what we have 25675 1727203986.99914: results queue empty 25675 1727203986.99915: checking for any_errors_fatal 25675 1727203986.99916: done checking for any_errors_fatal 25675 1727203986.99917: checking for max_fail_percentage 25675 1727203986.99918: done checking for max_fail_percentage 25675 1727203986.99919: checking to see if all hosts have failed and the running result is not ok 25675 1727203986.99920: done checking to see if all hosts have failed 25675 1727203986.99921: getting the remaining hosts for this loop 25675 1727203986.99922: done getting the remaining hosts for this loop 25675 1727203986.99926: getting the next task for host managed-node2 25675 1727203986.99932: done getting next task for host managed-node2 25675 1727203986.99934: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 25675 1727203986.99937: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203986.99940: getting variables 25675 1727203986.99942: in VariableManager get_vars() 25675 1727203986.99972: Calling all_inventory to load vars for managed-node2 25675 1727203986.99977: Calling groups_inventory to load vars for managed-node2 25675 1727203986.99981: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203986.99996: Calling all_plugins_play to load vars for managed-node2 25675 1727203986.99999: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.00002: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.00364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.00558: done with get_vars() 25675 1727203987.00567: done getting variables 25675 1727203987.00621: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.023) 0:00:06.457 ***** 25675 1727203987.00646: entering _queue_task() for managed-node2/fail 25675 1727203987.00879: worker is 1 (out of 1 available) 25675 1727203987.00890: exiting _queue_task() for managed-node2/fail 25675 1727203987.00901: done queuing things up, now waiting for results queue to drain 25675 1727203987.00902: waiting for pending results... 25675 1727203987.01295: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 25675 1727203987.01299: in run() - task 028d2410-947f-41bd-b19d-000000000132 25675 1727203987.01302: variable 'ansible_search_path' from source: unknown 25675 1727203987.01305: variable 'ansible_search_path' from source: unknown 25675 1727203987.01308: calling self._execute() 25675 1727203987.01359: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.01373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.01393: variable 'omit' from source: magic vars 25675 1727203987.01738: variable 'ansible_distribution_major_version' from source: facts 25675 1727203987.01753: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203987.01906: variable 'type' from source: set_fact 25675 1727203987.01916: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 25675 1727203987.01922: when evaluation is False, skipping this task 25675 1727203987.01931: _execute() done 25675 1727203987.01940: dumping result to json 25675 1727203987.01947: done dumping result, returning 25675 1727203987.01956: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [028d2410-947f-41bd-b19d-000000000132] 25675 1727203987.01965: sending task result for task 028d2410-947f-41bd-b19d-000000000132 25675 1727203987.02195: done sending task result for task 028d2410-947f-41bd-b19d-000000000132 25675 1727203987.02198: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 25675 1727203987.02240: no more pending 
results, returning what we have 25675 1727203987.02243: results queue empty 25675 1727203987.02244: checking for any_errors_fatal 25675 1727203987.02249: done checking for any_errors_fatal 25675 1727203987.02250: checking for max_fail_percentage 25675 1727203987.02251: done checking for max_fail_percentage 25675 1727203987.02252: checking to see if all hosts have failed and the running result is not ok 25675 1727203987.02252: done checking to see if all hosts have failed 25675 1727203987.02253: getting the remaining hosts for this loop 25675 1727203987.02254: done getting the remaining hosts for this loop 25675 1727203987.02257: getting the next task for host managed-node2 25675 1727203987.02263: done getting next task for host managed-node2 25675 1727203987.02265: ^ task is: TASK: Include the task 'show_interfaces.yml' 25675 1727203987.02267: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203987.02272: getting variables 25675 1727203987.02274: in VariableManager get_vars() 25675 1727203987.02301: Calling all_inventory to load vars for managed-node2 25675 1727203987.02303: Calling groups_inventory to load vars for managed-node2 25675 1727203987.02306: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203987.02315: Calling all_plugins_play to load vars for managed-node2 25675 1727203987.02318: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.02320: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.02558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.02786: done with get_vars() 25675 1727203987.02796: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.022) 0:00:06.480 ***** 25675 1727203987.02888: entering _queue_task() for managed-node2/include_tasks 25675 1727203987.03127: worker is 1 (out of 1 available) 25675 1727203987.03139: exiting _queue_task() for managed-node2/include_tasks 25675 1727203987.03151: done queuing things up, now waiting for results queue to drain 25675 1727203987.03152: waiting for pending results... 
25675 1727203987.03405: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 25675 1727203987.03511: in run() - task 028d2410-947f-41bd-b19d-000000000133 25675 1727203987.03581: variable 'ansible_search_path' from source: unknown 25675 1727203987.03585: variable 'ansible_search_path' from source: unknown 25675 1727203987.03588: calling self._execute() 25675 1727203987.03653: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.03666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.03685: variable 'omit' from source: magic vars 25675 1727203987.04050: variable 'ansible_distribution_major_version' from source: facts 25675 1727203987.04068: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203987.04083: _execute() done 25675 1727203987.04090: dumping result to json 25675 1727203987.04180: done dumping result, returning 25675 1727203987.04184: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-41bd-b19d-000000000133] 25675 1727203987.04186: sending task result for task 028d2410-947f-41bd-b19d-000000000133 25675 1727203987.04242: done sending task result for task 028d2410-947f-41bd-b19d-000000000133 25675 1727203987.04244: WORKER PROCESS EXITING 25675 1727203987.04271: no more pending results, returning what we have 25675 1727203987.04278: in VariableManager get_vars() 25675 1727203987.04309: Calling all_inventory to load vars for managed-node2 25675 1727203987.04312: Calling groups_inventory to load vars for managed-node2 25675 1727203987.04316: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203987.04329: Calling all_plugins_play to load vars for managed-node2 25675 1727203987.04332: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.04335: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.04653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.04843: done with get_vars() 25675 1727203987.04850: variable 'ansible_search_path' from source: unknown 25675 1727203987.04851: variable 'ansible_search_path' from source: unknown 25675 1727203987.04889: we have included files to process 25675 1727203987.04890: generating all_blocks data 25675 1727203987.04892: done generating all_blocks data 25675 1727203987.04897: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25675 1727203987.04898: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25675 1727203987.04900: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25675 1727203987.04998: in VariableManager get_vars() 25675 1727203987.05015: done with get_vars() 25675 1727203987.05121: done processing included file 25675 1727203987.05123: iterating over new_blocks loaded from include file 25675 1727203987.05124: in VariableManager get_vars() 25675 1727203987.05135: done with get_vars() 25675 1727203987.05137: filtering new block on tags 25675 1727203987.05154: done filtering new block on tags 25675 1727203987.05156: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 25675 1727203987.05160: extending task lists for all hosts with included blocks 25675 1727203987.05545: done extending task lists 25675 1727203987.05546: done processing included files 25675 1727203987.05547: results queue empty 25675 1727203987.05548: checking for any_errors_fatal 25675 1727203987.05550: done checking for any_errors_fatal 25675 1727203987.05551: checking for max_fail_percentage 25675 1727203987.05552: done checking for max_fail_percentage 25675 1727203987.05553: checking to see if all hosts have failed and the running result is not ok 25675 1727203987.05554: done checking to see if all hosts have failed 25675 1727203987.05554: getting the remaining hosts for this loop 25675 1727203987.05556: done getting the remaining hosts for this loop 25675 1727203987.05558: getting the next task for host managed-node2 25675 1727203987.05562: done getting next task for host managed-node2 25675 1727203987.05564: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25675 1727203987.05567: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203987.05572: getting variables 25675 1727203987.05573: in VariableManager get_vars() 25675 1727203987.05584: Calling all_inventory to load vars for managed-node2 25675 1727203987.05586: Calling groups_inventory to load vars for managed-node2 25675 1727203987.05588: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203987.05593: Calling all_plugins_play to load vars for managed-node2 25675 1727203987.05595: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.05598: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.05758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.05952: done with get_vars() 25675 1727203987.05960: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.031) 0:00:06.511 ***** 25675 1727203987.06032: entering _queue_task() for managed-node2/include_tasks 25675 1727203987.06489: worker is 1 (out of 1 available) 25675 1727203987.06496: exiting _queue_task() for managed-node2/include_tasks 25675 1727203987.06505: done queuing things up, now waiting for results queue to drain 25675 1727203987.06506: waiting for pending results... 
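At this point the include chain for this part of the run is fully visible from the 'task path:' lines: tests_ethernet.yml:26 pulls in manage_test_interface.yml, whose line 13 includes show_interfaces.yml, whose line 3 in turn includes get_current_interfaces.yml, the file whose 'Gather current interface info' and 'Set current_interfaces' tasks ran at the top of this section. In outline, again reconstructed from the paths rather than quoted:

# Include chain reconstructed from the task paths in this log (file contents inferred)
# tests_ethernet.yml, line 26:
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml   # relative path form is a guess
# manage_test_interface.yml, line 13:
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml               # path form is a guess
# show_interfaces.yml, line 3:
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml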
25675 1727203987.06633: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 25675 1727203987.06665: in run() - task 028d2410-947f-41bd-b19d-00000000015c 25675 1727203987.06687: variable 'ansible_search_path' from source: unknown 25675 1727203987.06695: variable 'ansible_search_path' from source: unknown 25675 1727203987.06736: calling self._execute() 25675 1727203987.06838: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.06842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.06844: variable 'omit' from source: magic vars 25675 1727203987.07192: variable 'ansible_distribution_major_version' from source: facts 25675 1727203987.07207: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203987.07276: _execute() done 25675 1727203987.07280: dumping result to json 25675 1727203987.07282: done dumping result, returning 25675 1727203987.07285: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-41bd-b19d-00000000015c] 25675 1727203987.07287: sending task result for task 028d2410-947f-41bd-b19d-00000000015c 25675 1727203987.07349: done sending task result for task 028d2410-947f-41bd-b19d-00000000015c 25675 1727203987.07353: WORKER PROCESS EXITING 25675 1727203987.07404: no more pending results, returning what we have 25675 1727203987.07409: in VariableManager get_vars() 25675 1727203987.07442: Calling all_inventory to load vars for managed-node2 25675 1727203987.07446: Calling groups_inventory to load vars for managed-node2 25675 1727203987.07449: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203987.07464: Calling all_plugins_play to load vars for managed-node2 25675 1727203987.07467: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.07473: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.07745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.08050: done with get_vars() 25675 1727203987.08058: variable 'ansible_search_path' from source: unknown 25675 1727203987.08059: variable 'ansible_search_path' from source: unknown 25675 1727203987.08121: we have included files to process 25675 1727203987.08123: generating all_blocks data 25675 1727203987.08124: done generating all_blocks data 25675 1727203987.08125: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25675 1727203987.08126: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25675 1727203987.08128: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25675 1727203987.08389: done processing included file 25675 1727203987.08395: iterating over new_blocks loaded from include file 25675 1727203987.08397: in VariableManager get_vars() 25675 1727203987.08411: done with get_vars() 25675 1727203987.08412: filtering new block on tags 25675 1727203987.08431: done filtering new block on tags 25675 1727203987.08433: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 25675 1727203987.08437: extending task lists for all hosts with included blocks 25675 1727203987.08588: done extending task lists 25675 1727203987.08590: done processing included files 25675 1727203987.08591: results queue empty 25675 1727203987.08591: checking for any_errors_fatal 25675 1727203987.08594: done checking for any_errors_fatal 25675 1727203987.08595: checking for max_fail_percentage 25675 1727203987.08596: done checking for max_fail_percentage 25675 1727203987.08596: checking to see if all hosts have failed and the running result is not ok 25675 1727203987.08597: done checking to see if all hosts have failed 25675 1727203987.08598: getting the remaining hosts for this loop 25675 1727203987.08599: done getting the remaining hosts for this loop 25675 1727203987.08601: getting the next task for host managed-node2 25675 1727203987.08606: done getting next task for host managed-node2 25675 1727203987.08608: ^ task is: TASK: Gather current interface info 25675 1727203987.08611: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203987.08614: getting variables 25675 1727203987.08615: in VariableManager get_vars() 25675 1727203987.08622: Calling all_inventory to load vars for managed-node2 25675 1727203987.08624: Calling groups_inventory to load vars for managed-node2 25675 1727203987.08626: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203987.08631: Calling all_plugins_play to load vars for managed-node2 25675 1727203987.08633: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.08636: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.08783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.08967: done with get_vars() 25675 1727203987.08980: done getting variables 25675 1727203987.09017: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.030) 0:00:06.541 ***** 25675 1727203987.09045: entering _queue_task() for managed-node2/command 25675 1727203987.09495: worker is 1 (out of 1 available) 25675 1727203987.09503: exiting _queue_task() for managed-node2/command 25675 1727203987.09514: done queuing things up, now waiting for results queue to drain 25675 1727203987.09515: waiting for pending results... 
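The task queued above lives at get_current_interfaces.yml:3 and, per the module invocation and result further down in this log, runs "ls -1" in /sys/class/net and is reported as ok with changed: false. A minimal sketch of what that task likely looks like, assuming the register name from the '_current_interfaces' variable referenced later in the log and a changed_when override to match the reported result (neither is confirmed by the log):

- name: Gather current interface info
  command: ls -1            # matches _raw_params in the module args shown below
  args:
    chdir: /sys/class/net   # matches chdir in the module args shown below
  register: _current_interfaces   # assumed name, inferred from the log
  changed_when: false             # assumed; consistent with the final changed: false
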
25675 1727203987.09642: running TaskExecutor() for managed-node2/TASK: Gather current interface info 25675 1727203987.09672: in run() - task 028d2410-947f-41bd-b19d-000000000193 25675 1727203987.09692: variable 'ansible_search_path' from source: unknown 25675 1727203987.09700: variable 'ansible_search_path' from source: unknown 25675 1727203987.09741: calling self._execute() 25675 1727203987.09826: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.09847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.09957: variable 'omit' from source: magic vars 25675 1727203987.10258: variable 'ansible_distribution_major_version' from source: facts 25675 1727203987.10278: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203987.10294: variable 'omit' from source: magic vars 25675 1727203987.10347: variable 'omit' from source: magic vars 25675 1727203987.10394: variable 'omit' from source: magic vars 25675 1727203987.10438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203987.10505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203987.10517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203987.10720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203987.10724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203987.10726: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203987.10728: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.10730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.10880: Set connection var ansible_shell_type to sh 25675 1727203987.10893: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203987.10902: Set connection var ansible_timeout to 10 25675 1727203987.10941: Set connection var ansible_pipelining to False 25675 1727203987.10952: Set connection var ansible_shell_executable to /bin/sh 25675 1727203987.10959: Set connection var ansible_connection to ssh 25675 1727203987.11072: variable 'ansible_shell_executable' from source: unknown 25675 1727203987.11088: variable 'ansible_connection' from source: unknown 25675 1727203987.11101: variable 'ansible_module_compression' from source: unknown 25675 1727203987.11109: variable 'ansible_shell_type' from source: unknown 25675 1727203987.11118: variable 'ansible_shell_executable' from source: unknown 25675 1727203987.11126: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.11134: variable 'ansible_pipelining' from source: unknown 25675 1727203987.11140: variable 'ansible_timeout' from source: unknown 25675 1727203987.11147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.11295: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203987.11316: variable 'omit' from source: magic vars 25675 
1727203987.11327: starting attempt loop 25675 1727203987.11334: running the handler 25675 1727203987.11351: _low_level_execute_command(): starting 25675 1727203987.11363: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203987.12316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203987.12361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203987.12387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.12461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.12736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.14339: stdout chunk (state=3): >>>/root <<< 25675 1727203987.14474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203987.14551: stderr chunk (state=3): >>><<< 25675 1727203987.14554: stdout chunk (state=3): >>><<< 25675 1727203987.14639: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203987.14644: _low_level_execute_command(): starting 25675 1727203987.14853: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642 `" && echo ansible-tmp-1727203987.1461608-26125-30308274087642="` echo 
/root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642 `" ) && sleep 0' 25675 1727203987.16072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203987.16250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.16289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.16416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.18627: stdout chunk (state=3): >>>ansible-tmp-1727203987.1461608-26125-30308274087642=/root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642 <<< 25675 1727203987.18681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203987.18684: stdout chunk (state=3): >>><<< 25675 1727203987.18697: stderr chunk (state=3): >>><<< 25675 1727203987.18716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203987.1461608-26125-30308274087642=/root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203987.18751: variable 'ansible_module_compression' from source: unknown 25675 1727203987.19018: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727203987.19126: variable 'ansible_facts' from source: unknown 25675 1727203987.19129: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/AnsiballZ_command.py 25675 1727203987.19506: Sending initial data 25675 1727203987.19509: Sent initial data (155 bytes) 25675 1727203987.20795: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203987.20931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203987.21005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.21104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.21174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.23023: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203987.23093: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203987.23159: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpvv_9d1tz /root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/AnsiballZ_command.py <<< 25675 1727203987.23163: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/AnsiballZ_command.py" <<< 25675 1727203987.23251: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpvv_9d1tz" to remote "/root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/AnsiballZ_command.py" <<< 25675 1727203987.25071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203987.25078: stdout chunk (state=3): >>><<< 25675 1727203987.25081: stderr chunk (state=3): >>><<< 25675 1727203987.25138: done transferring module to remote 25675 1727203987.25154: _low_level_execute_command(): starting 25675 1727203987.25188: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/ /root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/AnsiballZ_command.py && sleep 0' 25675 1727203987.26731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203987.26990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203987.27104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.27148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.27286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.29218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203987.29232: stdout chunk (state=3): >>><<< 25675 1727203987.29714: stderr chunk (state=3): >>><<< 25675 1727203987.29718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203987.29724: _low_level_execute_command(): starting 25675 1727203987.29727: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/AnsiballZ_command.py && sleep 0' 25675 1727203987.30861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.30982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.31096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.46869: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:07.463685", "end": "2024-09-24 14:53:07.467170", "delta": "0:00:00.003485", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727203987.48430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203987.48594: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203987.48597: stdout chunk (state=3): >>><<< 25675 1727203987.48600: stderr chunk (state=3): >>><<< 25675 1727203987.48616: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:07.463685", "end": "2024-09-24 14:53:07.467170", "delta": "0:00:00.003485", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
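The raw module result above is easier to read reshaped as YAML; all values are copied from the JSON in the log (invocation/module_args omitted):

changed: true
rc: 0
cmd: [ls, "-1"]
stdout: |-
  bonding_masters
  eth0
  lo
stderr: ""
start: "2024-09-24 14:53:07.463685"
end: "2024-09-24 14:53:07.467170"
delta: "0:00:00.003485"

When the result is registered, Ansible also derives stdout_lines from stdout, which is what the following Set current_interfaces task consumes. Note that the module itself reports changed: true while the task result below is ok with changed: false; together with the "Evaluated conditional (False): False" entry after "handler run complete", that is consistent with a changed_when: false on the task (an assumption, since the source file is not shown in this log).
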
25675 1727203987.48666: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203987.48746: _low_level_execute_command(): starting 25675 1727203987.48756: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203987.1461608-26125-30308274087642/ > /dev/null 2>&1 && sleep 0' 25675 1727203987.49925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203987.49943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203987.49948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203987.50212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.50229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.50362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.52261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203987.52266: stdout chunk (state=3): >>><<< 25675 1727203987.52272: stderr chunk (state=3): >>><<< 25675 1727203987.52296: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203987.52305: handler run complete 25675 1727203987.52481: Evaluated conditional (False): False 25675 1727203987.52485: attempt loop complete, returning result 25675 1727203987.52487: _execute() done 25675 1727203987.52490: dumping result to json 25675 1727203987.52492: done dumping result, returning 25675 1727203987.52495: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [028d2410-947f-41bd-b19d-000000000193] 25675 1727203987.52497: sending task result for task 028d2410-947f-41bd-b19d-000000000193 25675 1727203987.52561: done sending task result for task 028d2410-947f-41bd-b19d-000000000193 25675 1727203987.52565: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003485", "end": "2024-09-24 14:53:07.467170", "rc": 0, "start": "2024-09-24 14:53:07.463685" } STDOUT: bonding_masters eth0 lo 25675 1727203987.52650: no more pending results, returning what we have 25675 1727203987.52654: results queue empty 25675 1727203987.52655: checking for any_errors_fatal 25675 1727203987.52656: done checking for any_errors_fatal 25675 1727203987.52657: checking for max_fail_percentage 25675 1727203987.52659: done checking for max_fail_percentage 25675 1727203987.52660: checking to see if all hosts have failed and the running result is not ok 25675 1727203987.52661: done checking to see if all hosts have failed 25675 1727203987.52661: getting the remaining hosts for this loop 25675 1727203987.52664: done getting the remaining hosts for this loop 25675 1727203987.52668: getting the next task for host managed-node2 25675 1727203987.52682: done getting next task for host managed-node2 25675 1727203987.52685: ^ task is: TASK: Set current_interfaces 25675 1727203987.52690: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203987.52695: getting variables 25675 1727203987.52697: in VariableManager get_vars() 25675 1727203987.52882: Calling all_inventory to load vars for managed-node2 25675 1727203987.52895: Calling groups_inventory to load vars for managed-node2 25675 1727203987.52900: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203987.52911: Calling all_plugins_play to load vars for managed-node2 25675 1727203987.52914: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.52917: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.53195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.53398: done with get_vars() 25675 1727203987.53410: done getting variables 25675 1727203987.53486: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.444) 0:00:06.986 ***** 25675 1727203987.53519: entering _queue_task() for managed-node2/set_fact 25675 1727203987.53910: worker is 1 (out of 1 available) 25675 1727203987.53920: exiting _queue_task() for managed-node2/set_fact 25675 1727203987.53930: done queuing things up, now waiting for results queue to drain 25675 1727203987.53931: waiting for pending results... 
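The next task, at get_current_interfaces.yml:9, turns the registered command output into the current_interfaces fact whose value appears in the result below. A hedged reconstruction, assuming the fact is built from the stdout_lines of the registered result (the variable name _current_interfaces comes from the log, the exact expression does not):

- name: Set current_interfaces
  set_fact:
    # stdout_lines of the "ls -1 /sys/class/net" result: bonding_masters, eth0, lo
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
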
25675 1727203987.54112: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 25675 1727203987.54233: in run() - task 028d2410-947f-41bd-b19d-000000000194 25675 1727203987.54253: variable 'ansible_search_path' from source: unknown 25675 1727203987.54262: variable 'ansible_search_path' from source: unknown 25675 1727203987.54314: calling self._execute() 25675 1727203987.54397: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.54415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.54481: variable 'omit' from source: magic vars 25675 1727203987.54916: variable 'ansible_distribution_major_version' from source: facts 25675 1727203987.54939: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203987.54957: variable 'omit' from source: magic vars 25675 1727203987.55018: variable 'omit' from source: magic vars 25675 1727203987.55157: variable '_current_interfaces' from source: set_fact 25675 1727203987.55243: variable 'omit' from source: magic vars 25675 1727203987.55375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203987.55380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203987.55382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203987.55389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203987.55407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203987.55441: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203987.55449: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.55456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.55581: Set connection var ansible_shell_type to sh 25675 1727203987.55584: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203987.55594: Set connection var ansible_timeout to 10 25675 1727203987.55610: Set connection var ansible_pipelining to False 25675 1727203987.55624: Set connection var ansible_shell_executable to /bin/sh 25675 1727203987.55630: Set connection var ansible_connection to ssh 25675 1727203987.55934: variable 'ansible_shell_executable' from source: unknown 25675 1727203987.55937: variable 'ansible_connection' from source: unknown 25675 1727203987.55939: variable 'ansible_module_compression' from source: unknown 25675 1727203987.55941: variable 'ansible_shell_type' from source: unknown 25675 1727203987.55942: variable 'ansible_shell_executable' from source: unknown 25675 1727203987.55944: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.55945: variable 'ansible_pipelining' from source: unknown 25675 1727203987.55947: variable 'ansible_timeout' from source: unknown 25675 1727203987.55948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.56067: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 25675 1727203987.56087: variable 'omit' from source: magic vars 25675 1727203987.56096: starting attempt loop 25675 1727203987.56103: running the handler 25675 1727203987.56117: handler run complete 25675 1727203987.56129: attempt loop complete, returning result 25675 1727203987.56135: _execute() done 25675 1727203987.56141: dumping result to json 25675 1727203987.56157: done dumping result, returning 25675 1727203987.56167: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [028d2410-947f-41bd-b19d-000000000194] 25675 1727203987.56180: sending task result for task 028d2410-947f-41bd-b19d-000000000194 ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 25675 1727203987.56445: no more pending results, returning what we have 25675 1727203987.56448: results queue empty 25675 1727203987.56449: checking for any_errors_fatal 25675 1727203987.56456: done checking for any_errors_fatal 25675 1727203987.56457: checking for max_fail_percentage 25675 1727203987.56458: done checking for max_fail_percentage 25675 1727203987.56460: checking to see if all hosts have failed and the running result is not ok 25675 1727203987.56460: done checking to see if all hosts have failed 25675 1727203987.56461: getting the remaining hosts for this loop 25675 1727203987.56463: done getting the remaining hosts for this loop 25675 1727203987.56467: getting the next task for host managed-node2 25675 1727203987.56480: done getting next task for host managed-node2 25675 1727203987.56484: ^ task is: TASK: Show current_interfaces 25675 1727203987.56487: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203987.56492: getting variables 25675 1727203987.56493: in VariableManager get_vars() 25675 1727203987.56522: Calling all_inventory to load vars for managed-node2 25675 1727203987.56525: Calling groups_inventory to load vars for managed-node2 25675 1727203987.56528: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203987.56539: Calling all_plugins_play to load vars for managed-node2 25675 1727203987.56542: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.56545: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.56931: done sending task result for task 028d2410-947f-41bd-b19d-000000000194 25675 1727203987.56935: WORKER PROCESS EXITING 25675 1727203987.56956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.57237: done with get_vars() 25675 1727203987.57247: done getting variables 25675 1727203987.57305: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.038) 0:00:07.024 ***** 25675 1727203987.57335: entering _queue_task() for managed-node2/debug 25675 1727203987.57708: worker is 1 (out of 1 available) 25675 1727203987.57719: exiting _queue_task() for managed-node2/debug 25675 1727203987.57729: done queuing things up, now waiting for results queue to drain 25675 1727203987.57730: waiting for pending results... 
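The task queued here, at show_interfaces.yml:5, is a debug task; the MSG line in its result below reads "current_interfaces: ['bonding_masters', 'eth0', 'lo']", which a task of roughly this shape would produce (a sketch, not the verified source file):

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
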
25675 1727203987.57896: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 25675 1727203987.58019: in run() - task 028d2410-947f-41bd-b19d-00000000015d 25675 1727203987.58039: variable 'ansible_search_path' from source: unknown 25675 1727203987.58046: variable 'ansible_search_path' from source: unknown 25675 1727203987.58096: calling self._execute() 25675 1727203987.58230: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.58233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.58236: variable 'omit' from source: magic vars 25675 1727203987.58604: variable 'ansible_distribution_major_version' from source: facts 25675 1727203987.58620: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203987.58630: variable 'omit' from source: magic vars 25675 1727203987.58689: variable 'omit' from source: magic vars 25675 1727203987.58798: variable 'current_interfaces' from source: set_fact 25675 1727203987.58827: variable 'omit' from source: magic vars 25675 1727203987.58878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203987.58982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203987.58985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203987.58989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203987.58991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203987.59012: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203987.59019: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.59026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.59181: Set connection var ansible_shell_type to sh 25675 1727203987.59185: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203987.59187: Set connection var ansible_timeout to 10 25675 1727203987.59189: Set connection var ansible_pipelining to False 25675 1727203987.59197: Set connection var ansible_shell_executable to /bin/sh 25675 1727203987.59199: Set connection var ansible_connection to ssh 25675 1727203987.59213: variable 'ansible_shell_executable' from source: unknown 25675 1727203987.59222: variable 'ansible_connection' from source: unknown 25675 1727203987.59230: variable 'ansible_module_compression' from source: unknown 25675 1727203987.59237: variable 'ansible_shell_type' from source: unknown 25675 1727203987.59243: variable 'ansible_shell_executable' from source: unknown 25675 1727203987.59250: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.59258: variable 'ansible_pipelining' from source: unknown 25675 1727203987.59305: variable 'ansible_timeout' from source: unknown 25675 1727203987.59308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.59421: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
25675 1727203987.59441: variable 'omit' from source: magic vars 25675 1727203987.59449: starting attempt loop 25675 1727203987.59455: running the handler 25675 1727203987.59506: handler run complete 25675 1727203987.59539: attempt loop complete, returning result 25675 1727203987.59542: _execute() done 25675 1727203987.59581: dumping result to json 25675 1727203987.59584: done dumping result, returning 25675 1727203987.59587: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [028d2410-947f-41bd-b19d-00000000015d] 25675 1727203987.59589: sending task result for task 028d2410-947f-41bd-b19d-00000000015d 25675 1727203987.59936: done sending task result for task 028d2410-947f-41bd-b19d-00000000015d 25675 1727203987.59939: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 25675 1727203987.59985: no more pending results, returning what we have 25675 1727203987.59988: results queue empty 25675 1727203987.59989: checking for any_errors_fatal 25675 1727203987.59994: done checking for any_errors_fatal 25675 1727203987.59995: checking for max_fail_percentage 25675 1727203987.59996: done checking for max_fail_percentage 25675 1727203987.59997: checking to see if all hosts have failed and the running result is not ok 25675 1727203987.59998: done checking to see if all hosts have failed 25675 1727203987.59999: getting the remaining hosts for this loop 25675 1727203987.60000: done getting the remaining hosts for this loop 25675 1727203987.60004: getting the next task for host managed-node2 25675 1727203987.60011: done getting next task for host managed-node2 25675 1727203987.60013: ^ task is: TASK: Install iproute 25675 1727203987.60016: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203987.60020: getting variables 25675 1727203987.60021: in VariableManager get_vars() 25675 1727203987.60047: Calling all_inventory to load vars for managed-node2 25675 1727203987.60072: Calling groups_inventory to load vars for managed-node2 25675 1727203987.60077: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203987.60087: Calling all_plugins_play to load vars for managed-node2 25675 1727203987.60090: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203987.60093: Calling groups_plugins_play to load vars for managed-node2 25675 1727203987.60340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203987.60564: done with get_vars() 25675 1727203987.60579: done getting variables 25675 1727203987.60639: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.033) 0:00:07.057 ***** 25675 1727203987.60673: entering _queue_task() for managed-node2/package 25675 1727203987.61056: worker is 1 (out of 1 available) 25675 1727203987.61068: exiting _queue_task() for managed-node2/package 25675 1727203987.61084: done queuing things up, now waiting for results queue to drain 25675 1727203987.61085: waiting for pending results... 
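The next task, at manage_test_interface.yml:16, uses the package action plugin, which on this node resolves to the dnf module (the ANSIBALLZ lines below build AnsiballZ_dnf.py). A minimal sketch, assuming the task simply ensures the package is present:

- name: Install iproute
  package:
    name: iproute
    state: present   # assumed; only the task name and the package/dnf plugins are visible in the log

The __network_is_ostree fact consulted during variable resolution below suggests the real task also accounts for ostree systems; that logic is not reconstructed here.
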
25675 1727203987.61262: running TaskExecutor() for managed-node2/TASK: Install iproute 25675 1727203987.61358: in run() - task 028d2410-947f-41bd-b19d-000000000134 25675 1727203987.61387: variable 'ansible_search_path' from source: unknown 25675 1727203987.61395: variable 'ansible_search_path' from source: unknown 25675 1727203987.61436: calling self._execute() 25675 1727203987.61592: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.61606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.61610: variable 'omit' from source: magic vars 25675 1727203987.62626: variable 'ansible_distribution_major_version' from source: facts 25675 1727203987.62735: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203987.62738: variable 'omit' from source: magic vars 25675 1727203987.62740: variable 'omit' from source: magic vars 25675 1727203987.63173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203987.67452: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203987.67620: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203987.67711: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203987.67985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203987.67988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203987.68064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203987.68222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203987.68254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203987.68334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203987.68396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203987.68684: variable '__network_is_ostree' from source: set_fact 25675 1727203987.68687: variable 'omit' from source: magic vars 25675 1727203987.68713: variable 'omit' from source: magic vars 25675 1727203987.68746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203987.68821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203987.68845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203987.68922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 25675 1727203987.68938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203987.68973: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203987.69043: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.69063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.69236: Set connection var ansible_shell_type to sh 25675 1727203987.69247: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203987.69291: Set connection var ansible_timeout to 10 25675 1727203987.69303: Set connection var ansible_pipelining to False 25675 1727203987.69312: Set connection var ansible_shell_executable to /bin/sh 25675 1727203987.69317: Set connection var ansible_connection to ssh 25675 1727203987.69357: variable 'ansible_shell_executable' from source: unknown 25675 1727203987.69365: variable 'ansible_connection' from source: unknown 25675 1727203987.69377: variable 'ansible_module_compression' from source: unknown 25675 1727203987.69386: variable 'ansible_shell_type' from source: unknown 25675 1727203987.69393: variable 'ansible_shell_executable' from source: unknown 25675 1727203987.69399: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203987.69406: variable 'ansible_pipelining' from source: unknown 25675 1727203987.69413: variable 'ansible_timeout' from source: unknown 25675 1727203987.69420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203987.69529: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203987.69544: variable 'omit' from source: magic vars 25675 1727203987.69577: starting attempt loop 25675 1727203987.69586: running the handler 25675 1727203987.69598: variable 'ansible_facts' from source: unknown 25675 1727203987.69604: variable 'ansible_facts' from source: unknown 25675 1727203987.69641: _low_level_execute_command(): starting 25675 1727203987.69654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203987.70382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203987.70404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203987.70433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203987.70536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203987.70559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203987.70588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.70604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.70713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.72408: stdout chunk (state=3): >>>/root <<< 25675 1727203987.72530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203987.72629: stderr chunk (state=3): >>><<< 25675 1727203987.72632: stdout chunk (state=3): >>><<< 25675 1727203987.72636: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203987.72646: _low_level_execute_command(): starting 25675 1727203987.72649: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548 `" && echo ansible-tmp-1727203987.725861-26236-46727734903548="` echo /root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548 `" ) && sleep 0' 25675 1727203987.73286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203987.73289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203987.73292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203987.73364: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203987.73379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.73391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.73611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.75396: stdout chunk (state=3): >>>ansible-tmp-1727203987.725861-26236-46727734903548=/root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548 <<< 25675 1727203987.75680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203987.75683: stdout chunk (state=3): >>><<< 25675 1727203987.75686: stderr chunk (state=3): >>><<< 25675 1727203987.75688: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203987.725861-26236-46727734903548=/root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203987.75690: variable 'ansible_module_compression' from source: unknown 25675 1727203987.75693: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 25675 1727203987.75695: ANSIBALLZ: Acquiring lock 25675 1727203987.75697: ANSIBALLZ: Lock acquired: 139822507557424 25675 1727203987.75699: ANSIBALLZ: Creating module 25675 1727203987.94211: ANSIBALLZ: Writing module into payload 25675 1727203987.94628: ANSIBALLZ: Writing module 25675 1727203987.94650: ANSIBALLZ: Renaming module 25675 1727203987.94663: ANSIBALLZ: Done creating module 25675 1727203987.94815: variable 'ansible_facts' from source: unknown 25675 1727203987.94991: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/AnsiballZ_dnf.py 25675 1727203987.95179: Sending initial data 25675 1727203987.95183: Sent initial data (150 bytes) 25675 1727203987.96702: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203987.96820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203987.96858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203987.97022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203987.98628: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 25675 1727203987.98640: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203987.98758: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203987.99003: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpr6y8wb4o /root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/AnsiballZ_dnf.py <<< 25675 1727203987.99011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/AnsiballZ_dnf.py" <<< 25675 1727203987.99090: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpr6y8wb4o" to remote "/root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/AnsiballZ_dnf.py" <<< 25675 1727203988.00956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203988.01109: stderr chunk (state=3): >>><<< 25675 1727203988.01118: stdout chunk (state=3): >>><<< 25675 1727203988.01286: done transferring module to remote 25675 1727203988.01303: _low_level_execute_command(): starting 25675 1727203988.01399: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/ /root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/AnsiballZ_dnf.py && sleep 0' 25675 1727203988.03009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 25675 1727203988.03028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203988.03031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203988.03169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.03245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.05118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203988.05231: stderr chunk (state=3): >>><<< 25675 1727203988.05235: stdout chunk (state=3): >>><<< 25675 1727203988.05300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203988.05304: _low_level_execute_command(): starting 25675 1727203988.05309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/AnsiballZ_dnf.py && sleep 0' 25675 1727203988.06983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203988.07094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203988.07537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203988.07541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.07664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.48633: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25675 1727203988.52756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203988.52784: stderr chunk (state=3): >>><<< 25675 1727203988.52787: stdout chunk (state=3): >>><<< 25675 1727203988.52803: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
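The module result above shows ansible.legacy.dnf (reached through the package action plugin) invoked with module_args name=["iproute"], state="present", returning rc=0 and "Nothing to do" because the package is already present. A minimal sketch of a task that would produce this invocation is below; the register/until handling is inferred from the "attempts": 1 field and the '__install_status is success' conditional evaluated further down, and the retry and delay values are illustrative rather than taken from the log.

    # Hedged reconstruction of the "Install iproute" task; details may differ
    # from the real test playbook.
    - name: Install iproute
      ansible.builtin.package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3   # illustrative; the log only records "attempts": 1
      delay: 5     # illustrative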
25675 1727203988.52836: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203988.52845: _low_level_execute_command(): starting 25675 1727203988.52847: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203987.725861-26236-46727734903548/ > /dev/null 2>&1 && sleep 0' 25675 1727203988.53318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203988.53321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203988.53323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203988.53325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203988.53327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203988.53374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203988.53380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203988.53382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.53461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.55295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203988.55319: stderr chunk (state=3): >>><<< 25675 1727203988.55322: stdout chunk (state=3): >>><<< 25675 1727203988.55336: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203988.55342: handler run complete 25675 1727203988.55460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203988.55585: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203988.55613: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203988.55634: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203988.55655: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203988.55712: variable '__install_status' from source: unknown 25675 1727203988.55726: Evaluated conditional (__install_status is success): True 25675 1727203988.55737: attempt loop complete, returning result 25675 1727203988.55740: _execute() done 25675 1727203988.55742: dumping result to json 25675 1727203988.55748: done dumping result, returning 25675 1727203988.55755: done running TaskExecutor() for managed-node2/TASK: Install iproute [028d2410-947f-41bd-b19d-000000000134] 25675 1727203988.55759: sending task result for task 028d2410-947f-41bd-b19d-000000000134 25675 1727203988.55851: done sending task result for task 028d2410-947f-41bd-b19d-000000000134 25675 1727203988.55854: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25675 1727203988.55929: no more pending results, returning what we have 25675 1727203988.55933: results queue empty 25675 1727203988.55934: checking for any_errors_fatal 25675 1727203988.55938: done checking for any_errors_fatal 25675 1727203988.55939: checking for max_fail_percentage 25675 1727203988.55941: done checking for max_fail_percentage 25675 1727203988.55941: checking to see if all hosts have failed and the running result is not ok 25675 1727203988.55942: done checking to see if all hosts have failed 25675 1727203988.55943: getting the remaining hosts for this loop 25675 1727203988.55944: done getting the remaining hosts for this loop 25675 1727203988.55947: getting the next task for host managed-node2 25675 1727203988.55953: done getting next task for host managed-node2 25675 1727203988.55956: ^ task is: TASK: Create veth interface {{ interface }} 25675 1727203988.55958: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203988.55962: getting variables 25675 1727203988.55963: in VariableManager get_vars() 25675 1727203988.55992: Calling all_inventory to load vars for managed-node2 25675 1727203988.55995: Calling groups_inventory to load vars for managed-node2 25675 1727203988.55998: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203988.56008: Calling all_plugins_play to load vars for managed-node2 25675 1727203988.56010: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203988.56013: Calling groups_plugins_play to load vars for managed-node2 25675 1727203988.56393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203988.56503: done with get_vars() 25675 1727203988.56510: done getting variables 25675 1727203988.56551: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727203988.56634: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:53:08 -0400 (0:00:00.959) 0:00:08.017 ***** 25675 1727203988.56654: entering _queue_task() for managed-node2/command 25675 1727203988.56855: worker is 1 (out of 1 available) 25675 1727203988.56868: exiting _queue_task() for managed-node2/command 25675 1727203988.56884: done queuing things up, now waiting for results queue to drain 25675 1727203988.56885: waiting for pending results... 
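The next task, "Create veth interface lsr27", runs the command action in a loop (the items lookup is loaded for it) and is gated on ansible_distribution_major_version != '6' and on type == 'veth' and state == 'present' and interface not in current_interfaces, with interface coming from an earlier set_fact. A hedged reconstruction of what this task in manage_test_interface.yml could look like follows; only the first loop item (ip link add lsr27 type veth peer name peerlsr27) appears in this part of the log, so the remaining items and the changed_when override (suggested by the "changed": false reported later for the item) are assumptions.

    # Hedged sketch only; the actual manage_test_interface.yml may differ.
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up   # assumed follow-up item, not shown in this part of the log
        - ip link set {{ interface }} up       # assumed follow-up item, not shown in this part of the log
      when:
        - ansible_distribution_major_version != '6'
        - type == 'veth' and state == 'present' and interface not in current_interfaces
      changed_when: false   # assumed from the "changed": false in the reported item result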
25675 1727203988.57128: running TaskExecutor() for managed-node2/TASK: Create veth interface lsr27 25675 1727203988.57133: in run() - task 028d2410-947f-41bd-b19d-000000000135 25675 1727203988.57136: variable 'ansible_search_path' from source: unknown 25675 1727203988.57138: variable 'ansible_search_path' from source: unknown 25675 1727203988.57303: variable 'interface' from source: set_fact 25675 1727203988.57360: variable 'interface' from source: set_fact 25675 1727203988.57409: variable 'interface' from source: set_fact 25675 1727203988.57518: Loaded config def from plugin (lookup/items) 25675 1727203988.57522: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 25675 1727203988.57542: variable 'omit' from source: magic vars 25675 1727203988.57628: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203988.57638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203988.57647: variable 'omit' from source: magic vars 25675 1727203988.57814: variable 'ansible_distribution_major_version' from source: facts 25675 1727203988.57819: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203988.57942: variable 'type' from source: set_fact 25675 1727203988.57945: variable 'state' from source: include params 25675 1727203988.57948: variable 'interface' from source: set_fact 25675 1727203988.57952: variable 'current_interfaces' from source: set_fact 25675 1727203988.57959: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25675 1727203988.57965: variable 'omit' from source: magic vars 25675 1727203988.57995: variable 'omit' from source: magic vars 25675 1727203988.58025: variable 'item' from source: unknown 25675 1727203988.58076: variable 'item' from source: unknown 25675 1727203988.58089: variable 'omit' from source: magic vars 25675 1727203988.58113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203988.58134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203988.58148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203988.58161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203988.58172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203988.58197: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203988.58202: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203988.58204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203988.58264: Set connection var ansible_shell_type to sh 25675 1727203988.58268: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203988.58274: Set connection var ansible_timeout to 10 25675 1727203988.58281: Set connection var ansible_pipelining to False 25675 1727203988.58286: Set connection var ansible_shell_executable to /bin/sh 25675 1727203988.58289: Set connection var ansible_connection to ssh 25675 1727203988.58305: variable 'ansible_shell_executable' from source: unknown 25675 1727203988.58308: variable 'ansible_connection' from source: unknown 25675 1727203988.58311: variable 
'ansible_module_compression' from source: unknown 25675 1727203988.58315: variable 'ansible_shell_type' from source: unknown 25675 1727203988.58317: variable 'ansible_shell_executable' from source: unknown 25675 1727203988.58319: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203988.58322: variable 'ansible_pipelining' from source: unknown 25675 1727203988.58324: variable 'ansible_timeout' from source: unknown 25675 1727203988.58327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203988.58419: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203988.58427: variable 'omit' from source: magic vars 25675 1727203988.58433: starting attempt loop 25675 1727203988.58437: running the handler 25675 1727203988.58450: _low_level_execute_command(): starting 25675 1727203988.58457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203988.58962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203988.58966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203988.58969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203988.58971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203988.59026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203988.59029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203988.59031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.59111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.60758: stdout chunk (state=3): >>>/root <<< 25675 1727203988.60859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203988.60889: stderr chunk (state=3): >>><<< 25675 1727203988.60892: stdout chunk (state=3): >>><<< 25675 1727203988.60914: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203988.60925: _low_level_execute_command(): starting 25675 1727203988.60936: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024 `" && echo ansible-tmp-1727203988.6091318-26350-126065848754024="` echo /root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024 `" ) && sleep 0' 25675 1727203988.61592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203988.61597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203988.61600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.61651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.63572: stdout chunk (state=3): >>>ansible-tmp-1727203988.6091318-26350-126065848754024=/root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024 <<< 25675 1727203988.63680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203988.63715: stderr chunk (state=3): >>><<< 25675 1727203988.63718: stdout chunk (state=3): >>><<< 25675 1727203988.63734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203988.6091318-26350-126065848754024=/root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203988.63756: variable 'ansible_module_compression' from source: unknown 25675 1727203988.63800: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727203988.63829: variable 'ansible_facts' from source: unknown 25675 1727203988.63885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/AnsiballZ_command.py 25675 1727203988.63977: Sending initial data 25675 1727203988.63980: Sent initial data (156 bytes) 25675 1727203988.64655: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.64897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.66335: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25675 1727203988.66339: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203988.66404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203988.66518: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpl7tj38l6 /root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/AnsiballZ_command.py <<< 25675 1727203988.66522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/AnsiballZ_command.py" <<< 25675 1727203988.66628: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpl7tj38l6" to remote "/root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/AnsiballZ_command.py" <<< 25675 1727203988.67607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203988.67610: stdout chunk (state=3): >>><<< 25675 1727203988.67613: stderr chunk (state=3): >>><<< 25675 1727203988.67615: done transferring module to remote 25675 1727203988.67617: _low_level_execute_command(): starting 25675 1727203988.67619: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/ /root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/AnsiballZ_command.py && sleep 0' 25675 1727203988.68177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203988.68302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203988.68379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203988.68410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.68516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.70338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203988.70347: stdout chunk (state=3): >>><<< 25675 1727203988.70364: stderr chunk (state=3): >>><<< 25675 1727203988.70390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203988.70399: _low_level_execute_command(): starting 25675 1727203988.70467: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/AnsiballZ_command.py && sleep 0' 25675 1727203988.70981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203988.70995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203988.71007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203988.71030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203988.71045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203988.71141: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203988.71154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203988.71178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.71284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.87602: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-24 14:53:08.863338", "end": "2024-09-24 14:53:08.870022", "delta": "0:00:00.006684", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727203988.89774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203988.90011: stdout chunk (state=3): >>><<< 25675 1727203988.90015: stderr chunk (state=3): >>><<< 25675 1727203988.90018: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-24 14:53:08.863338", "end": "2024-09-24 14:53:08.870022", "delta": "0:00:00.006684", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
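At this point the ip link add command has created the virtual Ethernet pair: lsr27 and its peer peerlsr27 now exist as two connected interfaces on managed-node2, and the module returned rc=0 with a delta of roughly 7 ms. Purely as an illustration (not part of the captured run), a follow-up task to confirm the pair out of band could look like this:

    # Illustrative verification task; interface names taken from the log above.
    - name: Check that the veth pair exists
      ansible.builtin.command: ip link show {{ item }}
      with_items:
        - lsr27
        - peerlsr27
      changed_when: false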
25675 1727203988.90021: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203988.90024: _low_level_execute_command(): starting 25675 1727203988.90026: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203988.6091318-26350-126065848754024/ > /dev/null 2>&1 && sleep 0' 25675 1727203988.91016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203988.91032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203988.91047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203988.91067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203988.91107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 25675 1727203988.91124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203988.91217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203988.91241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.91426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203988.95146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203988.95211: stderr chunk (state=3): >>><<< 25675 1727203988.95219: stdout chunk (state=3): >>><<< 25675 1727203988.95240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203988.95250: handler run complete 25675 1727203988.95282: Evaluated conditional (False): False 25675 1727203988.95298: attempt loop complete, returning result 25675 1727203988.95384: variable 'item' from source: unknown 25675 1727203988.95448: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.006684", "end": "2024-09-24 14:53:08.870022", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-24 14:53:08.863338" } 25675 1727203988.96097: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203988.96101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203988.96103: variable 'omit' from source: magic vars 25675 1727203988.96153: variable 'ansible_distribution_major_version' from source: facts 25675 1727203988.96166: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203988.96381: variable 'type' from source: set_fact 25675 1727203988.96392: variable 'state' from source: include params 25675 1727203988.96401: variable 'interface' from source: set_fact 25675 1727203988.96410: variable 'current_interfaces' from source: set_fact 25675 1727203988.96424: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25675 1727203988.96433: variable 'omit' from source: magic vars 25675 1727203988.96452: variable 'omit' from source: magic vars 25675 1727203988.96501: variable 'item' from source: unknown 25675 1727203988.96573: variable 'item' from source: unknown 25675 1727203988.96880: variable 'omit' from source: magic vars 25675 1727203988.96883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203988.96885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203988.96888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203988.96890: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203988.96892: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203988.96894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203988.96896: Set connection var ansible_shell_type to sh 25675 1727203988.96898: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203988.96900: Set connection var ansible_timeout to 10 25675 1727203988.96901: Set connection var ansible_pipelining to False 25675 1727203988.96989: Set 
connection var ansible_shell_executable to /bin/sh 25675 1727203988.96997: Set connection var ansible_connection to ssh 25675 1727203988.97234: variable 'ansible_shell_executable' from source: unknown 25675 1727203988.97237: variable 'ansible_connection' from source: unknown 25675 1727203988.97239: variable 'ansible_module_compression' from source: unknown 25675 1727203988.97241: variable 'ansible_shell_type' from source: unknown 25675 1727203988.97243: variable 'ansible_shell_executable' from source: unknown 25675 1727203988.97245: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203988.97247: variable 'ansible_pipelining' from source: unknown 25675 1727203988.97249: variable 'ansible_timeout' from source: unknown 25675 1727203988.97251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203988.97384: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203988.97402: variable 'omit' from source: magic vars 25675 1727203988.97411: starting attempt loop 25675 1727203988.97418: running the handler 25675 1727203988.97431: _low_level_execute_command(): starting 25675 1727203988.97439: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203988.98245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203988.98263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203988.98372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203988.98396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203988.98505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.00158: stdout chunk (state=3): >>>/root <<< 25675 1727203989.00302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.00326: stderr chunk (state=3): >>><<< 25675 1727203989.00335: stdout chunk (state=3): >>><<< 25675 1727203989.00356: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.00440: _low_level_execute_command(): starting 25675 1727203989.00444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412 `" && echo ansible-tmp-1727203989.0036108-26350-135850545964412="` echo /root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412 `" ) && sleep 0' 25675 1727203989.00990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.01006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.01023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.01041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203989.01108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.01156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.01171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.01194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.01301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.03203: stdout chunk (state=3): >>>ansible-tmp-1727203989.0036108-26350-135850545964412=/root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412 <<< 25675 1727203989.03482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.03485: stdout chunk (state=3): >>><<< 25675 1727203989.03487: stderr chunk (state=3): >>><<< 25675 1727203989.03489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203989.0036108-26350-135850545964412=/root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.03491: variable 'ansible_module_compression' from source: unknown 25675 1727203989.03493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727203989.03495: variable 'ansible_facts' from source: unknown 25675 1727203989.03735: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/AnsiballZ_command.py 25675 1727203989.04140: Sending initial data 25675 1727203989.04143: Sent initial data (156 bytes) 25675 1727203989.05587: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.05598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.05640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.05780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.07399: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203989.07468: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727203989.07535: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmplwh09y5b /root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/AnsiballZ_command.py <<< 25675 1727203989.07542: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/AnsiballZ_command.py" <<< 25675 1727203989.07619: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmplwh09y5b" to remote "/root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/AnsiballZ_command.py" <<< 25675 1727203989.08479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.08526: stderr chunk (state=3): >>><<< 25675 1727203989.08532: stdout chunk (state=3): >>><<< 25675 1727203989.08632: done transferring module to remote 25675 1727203989.08636: _low_level_execute_command(): starting 25675 1727203989.08640: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/ /root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/AnsiballZ_command.py && sleep 0' 25675 1727203989.09014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.09017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203989.09019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.09022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.09024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.09072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.09083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.09153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.10981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.10984: stdout 
chunk (state=3): >>><<< 25675 1727203989.10989: stderr chunk (state=3): >>><<< 25675 1727203989.11007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.11084: _low_level_execute_command(): starting 25675 1727203989.11088: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/AnsiballZ_command.py && sleep 0' 25675 1727203989.11551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.11554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203989.11556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.11558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.11560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203989.11562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.11611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.11614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.11697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.27326: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-24 14:53:09.268410", "end": "2024-09-24 14:53:09.272211", "delta": "0:00:00.003801", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set 
peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727203989.28879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727203989.28892: stderr chunk (state=3): >>><<< 25675 1727203989.28895: stdout chunk (state=3): >>><<< 25675 1727203989.28911: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-24 14:53:09.268410", "end": "2024-09-24 14:53:09.272211", "delta": "0:00:00.003801", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727203989.28938: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203989.28944: _low_level_execute_command(): starting 25675 1727203989.28949: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203989.0036108-26350-135850545964412/ > /dev/null 2>&1 && sleep 0' 25675 1727203989.29391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.29396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.29399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.29401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.29402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.29452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.29456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.29460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.29530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.31380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.31403: stderr chunk (state=3): >>><<< 25675 1727203989.31406: stdout chunk (state=3): >>><<< 25675 1727203989.31421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.31426: handler run complete 25675 1727203989.31443: Evaluated conditional (False): False 25675 1727203989.31450: attempt loop complete, returning result 25675 1727203989.31465: variable 'item' from source: unknown 25675 1727203989.31529: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003801", "end": "2024-09-24 14:53:09.272211", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-24 14:53:09.268410" } 25675 1727203989.31642: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203989.31647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203989.31650: variable 'omit' from source: magic vars 25675 1727203989.31748: variable 'ansible_distribution_major_version' from source: facts 25675 1727203989.31751: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203989.31871: variable 'type' from source: set_fact 25675 1727203989.31882: variable 'state' from source: include params 25675 1727203989.31885: variable 'interface' from source: set_fact 25675 1727203989.31887: variable 'current_interfaces' from source: set_fact 25675 1727203989.31892: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25675 1727203989.31896: variable 'omit' from source: magic vars 25675 1727203989.31907: variable 'omit' from source: magic vars 25675 1727203989.31931: variable 'item' from source: unknown 25675 1727203989.31976: variable 'item' from source: unknown 25675 1727203989.31990: variable 'omit' from source: magic vars 25675 1727203989.32006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203989.32013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203989.32018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203989.32029: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203989.32031: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203989.32034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203989.32082: Set connection var ansible_shell_type to sh 25675 1727203989.32087: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203989.32095: Set connection var ansible_timeout to 10 25675 1727203989.32097: Set connection var ansible_pipelining to False 25675 1727203989.32101: Set connection var ansible_shell_executable to /bin/sh 25675 1727203989.32104: Set connection var 
ansible_connection to ssh 25675 1727203989.32120: variable 'ansible_shell_executable' from source: unknown 25675 1727203989.32123: variable 'ansible_connection' from source: unknown 25675 1727203989.32126: variable 'ansible_module_compression' from source: unknown 25675 1727203989.32128: variable 'ansible_shell_type' from source: unknown 25675 1727203989.32130: variable 'ansible_shell_executable' from source: unknown 25675 1727203989.32132: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203989.32136: variable 'ansible_pipelining' from source: unknown 25675 1727203989.32138: variable 'ansible_timeout' from source: unknown 25675 1727203989.32142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203989.32209: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203989.32217: variable 'omit' from source: magic vars 25675 1727203989.32219: starting attempt loop 25675 1727203989.32222: running the handler 25675 1727203989.32228: _low_level_execute_command(): starting 25675 1727203989.32230: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203989.32673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.32679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.32681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 25675 1727203989.32683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.32685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.32731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.32737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.32739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.32807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.34426: stdout chunk (state=3): >>>/root <<< 25675 1727203989.34638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.34641: stdout chunk (state=3): >>><<< 25675 1727203989.34643: stderr chunk (state=3): >>><<< 25675 1727203989.34645: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.34647: _low_level_execute_command(): starting 25675 1727203989.34649: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556 `" && echo ansible-tmp-1727203989.3457005-26350-177744152293556="` echo /root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556 `" ) && sleep 0' 25675 1727203989.35132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.35143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.35228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.35248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.35261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.35280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.35374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.37252: stdout chunk (state=3): >>>ansible-tmp-1727203989.3457005-26350-177744152293556=/root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556 <<< 25675 1727203989.37359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.37390: stderr chunk (state=3): >>><<< 25675 1727203989.37393: stdout chunk (state=3): >>><<< 25675 1727203989.37407: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203989.3457005-26350-177744152293556=/root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.37422: variable 'ansible_module_compression' from source: unknown 25675 1727203989.37448: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727203989.37463: variable 'ansible_facts' from source: unknown 25675 1727203989.37510: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/AnsiballZ_command.py 25675 1727203989.37596: Sending initial data 25675 1727203989.37599: Sent initial data (156 bytes) 25675 1727203989.38019: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.38022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203989.38028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.38030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.38032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.38079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.38083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.38161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.39764: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25675 1727203989.39767: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203989.39829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727203989.39899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp5h9vo1li /root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/AnsiballZ_command.py <<< 25675 1727203989.39905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/AnsiballZ_command.py" <<< 25675 1727203989.39969: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp5h9vo1li" to remote "/root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/AnsiballZ_command.py" <<< 25675 1727203989.39977: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/AnsiballZ_command.py" <<< 25675 1727203989.40602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.40642: stderr chunk (state=3): >>><<< 25675 1727203989.40646: stdout chunk (state=3): >>><<< 25675 1727203989.40685: done transferring module to remote 25675 1727203989.40691: _low_level_execute_command(): starting 25675 1727203989.40696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/ /root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/AnsiballZ_command.py && sleep 0' 25675 1727203989.41235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.41429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.43282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.43286: stdout 
chunk (state=3): >>><<< 25675 1727203989.43288: stderr chunk (state=3): >>><<< 25675 1727203989.43381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.43385: _low_level_execute_command(): starting 25675 1727203989.43387: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/AnsiballZ_command.py && sleep 0' 25675 1727203989.43930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.43946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.43990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727203989.44004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.44090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.44116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.44141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.44250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.59964: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-24 14:53:09.592511", "end": "2024-09-24 14:53:09.596462", "delta": "0:00:00.003951", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727203989.61898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727203989.61904: stdout chunk (state=3): >>><<< 25675 1727203989.61907: stderr chunk (state=3): >>><<< 25675 1727203989.61909: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-24 14:53:09.592511", "end": "2024-09-24 14:53:09.596462", "delta": "0:00:00.003951", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727203989.61912: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203989.61914: _low_level_execute_command(): starting 25675 1727203989.61916: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203989.3457005-26350-177744152293556/ > /dev/null 2>&1 && sleep 0' 25675 1727203989.62934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.62948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.63210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.63336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.63387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.63552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.65495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.65550: stderr chunk (state=3): >>><<< 25675 1727203989.65589: stdout chunk (state=3): >>><<< 25675 1727203989.65593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.65596: handler run complete 25675 1727203989.65598: Evaluated conditional (False): False 25675 1727203989.65636: attempt loop complete, returning result 25675 1727203989.65639: variable 'item' from source: unknown 25675 1727203989.65885: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003951", "end": "2024-09-24 14:53:09.596462", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-24 14:53:09.592511" } 25675 1727203989.66007: dumping result to json 25675 1727203989.66129: done dumping result, returning 25675 1727203989.66132: done running TaskExecutor() for managed-node2/TASK: Create veth interface lsr27 [028d2410-947f-41bd-b19d-000000000135] 25675 1727203989.66135: sending task result for task 028d2410-947f-41bd-b19d-000000000135 25675 1727203989.66407: done sending task result for task 028d2410-947f-41bd-b19d-000000000135 25675 1727203989.66411: WORKER PROCESS EXITING 25675 1727203989.66520: no more pending results, returning what we have 25675 1727203989.66524: results queue empty 25675 1727203989.66524: checking for any_errors_fatal 25675 1727203989.66532: done checking for any_errors_fatal 25675 1727203989.66533: checking for max_fail_percentage 25675 1727203989.66534: done checking for max_fail_percentage 25675 1727203989.66535: checking to see if all hosts have failed and the running result is not ok 25675 1727203989.66539: done checking to see if all hosts have failed 25675 1727203989.66540: getting the remaining hosts for this loop 25675 1727203989.66541: done getting the remaining hosts for this loop 25675 1727203989.66545: getting the next task for host managed-node2 25675 1727203989.66551: done getting next task for host managed-node2 25675 1727203989.66554: ^ task is: TASK: Set up veth as managed by NetworkManager 25675 1727203989.66557: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203989.66561: getting variables 25675 1727203989.66562: in VariableManager get_vars() 25675 1727203989.66596: Calling all_inventory to load vars for managed-node2 25675 1727203989.66599: Calling groups_inventory to load vars for managed-node2 25675 1727203989.66602: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203989.66613: Calling all_plugins_play to load vars for managed-node2 25675 1727203989.66616: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203989.66619: Calling groups_plugins_play to load vars for managed-node2 25675 1727203989.67193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203989.67604: done with get_vars() 25675 1727203989.67614: done getting variables 25675 1727203989.67673: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:53:09 -0400 (0:00:01.111) 0:00:09.129 ***** 25675 1727203989.67806: entering _queue_task() for managed-node2/command 25675 1727203989.68369: worker is 1 (out of 1 available) 25675 1727203989.68386: exiting _queue_task() for managed-node2/command 25675 1727203989.68399: done queuing things up, now waiting for results queue to drain 25675 1727203989.68400: waiting for pending results... 
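The task picked up next, "Set up veth as managed by NetworkManager" (manage_test_interface.yml:35), also resolves to the command action module, and its conditional (type == 'veth' and state == 'present') is evaluated just below. The actual command is not shown in this excerpt, so the following tasks-file sketch is an assumption; in particular the nmcli invocation is only a plausible way to hand the freshly created veth over to NetworkManager:

    - name: Set up veth as managed by NetworkManager
      # Hypothetical command: mark the veth as managed so NetworkManager
      # will apply connection profiles to it.
      command: nmcli device set {{ interface }} managed true
      when:
        - type == 'veth'
        - state == 'present'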
25675 1727203989.68993: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 25675 1727203989.69119: in run() - task 028d2410-947f-41bd-b19d-000000000136 25675 1727203989.69141: variable 'ansible_search_path' from source: unknown 25675 1727203989.69186: variable 'ansible_search_path' from source: unknown 25675 1727203989.69552: calling self._execute() 25675 1727203989.69556: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203989.69559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203989.69567: variable 'omit' from source: magic vars 25675 1727203989.70316: variable 'ansible_distribution_major_version' from source: facts 25675 1727203989.70334: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203989.70610: variable 'type' from source: set_fact 25675 1727203989.70688: variable 'state' from source: include params 25675 1727203989.70699: Evaluated conditional (type == 'veth' and state == 'present'): True 25675 1727203989.70761: variable 'omit' from source: magic vars 25675 1727203989.70814: variable 'omit' from source: magic vars 25675 1727203989.70957: variable 'interface' from source: set_fact 25675 1727203989.71137: variable 'omit' from source: magic vars 25675 1727203989.71483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203989.71487: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203989.71489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203989.71492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203989.71494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203989.71596: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203989.71606: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203989.71614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203989.71834: Set connection var ansible_shell_type to sh 25675 1727203989.71845: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203989.71855: Set connection var ansible_timeout to 10 25675 1727203989.71927: Set connection var ansible_pipelining to False 25675 1727203989.71938: Set connection var ansible_shell_executable to /bin/sh 25675 1727203989.71945: Set connection var ansible_connection to ssh 25675 1727203989.71982: variable 'ansible_shell_executable' from source: unknown 25675 1727203989.71991: variable 'ansible_connection' from source: unknown 25675 1727203989.72061: variable 'ansible_module_compression' from source: unknown 25675 1727203989.72069: variable 'ansible_shell_type' from source: unknown 25675 1727203989.72082: variable 'ansible_shell_executable' from source: unknown 25675 1727203989.72090: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203989.72098: variable 'ansible_pipelining' from source: unknown 25675 1727203989.72142: variable 'ansible_timeout' from source: unknown 25675 1727203989.72151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203989.72573: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203989.72579: variable 'omit' from source: magic vars 25675 1727203989.72581: starting attempt loop 25675 1727203989.72583: running the handler 25675 1727203989.72586: _low_level_execute_command(): starting 25675 1727203989.72588: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203989.73659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.73680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.73700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.73784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.73834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.73861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.73890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.74216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.76022: stdout chunk (state=3): >>>/root <<< 25675 1727203989.76083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.76103: stderr chunk (state=3): >>><<< 25675 1727203989.76113: stdout chunk (state=3): >>><<< 25675 1727203989.76583: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.76587: _low_level_execute_command(): starting 25675 1727203989.76590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611 `" && echo ansible-tmp-1727203989.7629797-26511-131837199894611="` echo /root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611 `" ) && sleep 0' 25675 1727203989.77812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.78096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.78141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.78404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.80383: stdout chunk (state=3): >>>ansible-tmp-1727203989.7629797-26511-131837199894611=/root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611 <<< 25675 1727203989.80533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.80599: stdout chunk (state=3): >>><<< 25675 1727203989.80612: stderr chunk (state=3): >>><<< 25675 1727203989.80636: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203989.7629797-26511-131837199894611=/root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.80678: variable 'ansible_module_compression' from source: unknown 25675 1727203989.80832: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727203989.80880: variable 'ansible_facts' from source: unknown 25675 1727203989.81039: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/AnsiballZ_command.py 25675 1727203989.81501: Sending initial data 25675 1727203989.81504: Sent initial data (156 bytes) 25675 1727203989.82529: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.82545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203989.82563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203989.82590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203989.82692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203989.82716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.82744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.82769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.83224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.84515: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25675 1727203989.84519: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203989.84580: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203989.84855: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp61w742_6 /root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/AnsiballZ_command.py <<< 25675 1727203989.84860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp61w742_6" to remote "/root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/AnsiballZ_command.py" <<< 25675 1727203989.86914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.86986: stderr chunk (state=3): >>><<< 25675 1727203989.86995: stdout chunk (state=3): >>><<< 25675 1727203989.87023: done transferring module to remote 25675 1727203989.87095: _low_level_execute_command(): starting 25675 1727203989.87105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/ /root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/AnsiballZ_command.py && sleep 0' 25675 1727203989.88200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.88480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.88501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.88516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.88620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203989.90548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203989.90560: stdout chunk (state=3): >>><<< 25675 1727203989.90582: stderr chunk (state=3): >>><<< 25675 1727203989.90608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203989.90687: _low_level_execute_command(): starting 25675 1727203989.90697: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/AnsiballZ_command.py && sleep 0' 25675 1727203989.91688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203989.91978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203989.91997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203989.92010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203989.92293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203990.09498: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-24 14:53:10.072522", "end": "2024-09-24 14:53:10.091032", "delta": "0:00:00.018510", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727203990.10808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203990.10812: stdout chunk (state=3): >>><<< 25675 1727203990.10815: stderr chunk (state=3): >>><<< 25675 1727203990.10947: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-24 14:53:10.072522", "end": "2024-09-24 14:53:10.091032", "delta": "0:00:00.018510", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727203990.10952: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203990.10954: _low_level_execute_command(): starting 25675 1727203990.10956: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203989.7629797-26511-131837199894611/ > /dev/null 2>&1 && sleep 0' 25675 1727203990.12026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203990.12037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203990.12052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203990.12066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203990.12263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203990.12406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203990.12602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203990.14493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203990.14555: stderr chunk (state=3): >>><<< 25675 1727203990.14564: stdout chunk (state=3): >>><<< 25675 1727203990.14591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203990.14681: handler run complete 25675 1727203990.14689: Evaluated conditional (False): False 25675 1727203990.14705: attempt loop complete, returning result 25675 1727203990.14981: _execute() done 25675 1727203990.14985: dumping result to json 25675 1727203990.14988: done dumping result, returning 25675 1727203990.14990: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [028d2410-947f-41bd-b19d-000000000136] 25675 1727203990.14992: sending task result for task 028d2410-947f-41bd-b19d-000000000136 25675 1727203990.15064: done sending task result for task 028d2410-947f-41bd-b19d-000000000136 25675 1727203990.15068: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.018510", "end": "2024-09-24 14:53:10.091032", "rc": 0, "start": "2024-09-24 14:53:10.072522" } 25675 1727203990.15131: no more pending results, returning what we have 25675 1727203990.15134: results queue empty 25675 1727203990.15135: checking for any_errors_fatal 25675 1727203990.15144: done checking for any_errors_fatal 25675 1727203990.15145: checking for max_fail_percentage 25675 1727203990.15147: done checking for max_fail_percentage 25675 1727203990.15147: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.15148: done checking to see if all hosts have failed 25675 1727203990.15149: getting the remaining hosts for this loop 25675 1727203990.15150: done getting the remaining hosts for this loop 25675 1727203990.15154: getting the next task for host managed-node2 25675 1727203990.15160: done getting next task for host managed-node2 25675 1727203990.15162: ^ task is: TASK: Delete veth interface {{ interface }} 25675 1727203990.15165: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.15169: getting variables 25675 1727203990.15171: in VariableManager get_vars() 25675 1727203990.15203: Calling all_inventory to load vars for managed-node2 25675 1727203990.15206: Calling groups_inventory to load vars for managed-node2 25675 1727203990.15209: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.15220: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.15223: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.15225: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.15597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.15996: done with get_vars() 25675 1727203990.16007: done getting variables 25675 1727203990.16061: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727203990.16380: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.485) 0:00:09.615 ***** 25675 1727203990.16409: entering _queue_task() for managed-node2/command 25675 1727203990.17109: worker is 1 (out of 1 available) 25675 1727203990.17123: exiting _queue_task() for managed-node2/command 25675 1727203990.17134: done queuing things up, now waiting for results queue to drain 25675 1727203990.17135: waiting for pending results... 
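The trace above shows the "Set up veth as managed by NetworkManager" task executing `nmcli d set lsr27 managed true` over the multiplexed SSH connection and reporting `ok` with `"changed": false`, even though the command module itself returned `changed: true`. A minimal sketch of how that task plausibly looks in manage_test_interface.yml follows; it is reconstructed from the command and conditional visible in the trace, not copied from the file, and the `changed_when: false` line is only an inference from the final callback output.

# Reconstructed sketch, not copied from manage_test_interface.yml.
- name: Set up veth as managed by NetworkManager
  ansible.builtin.command: nmcli d set {{ interface }} managed true
  when: type == 'veth' and state == 'present'
  # Inference only: something suppresses the module's changed=true,
  # since the callback prints "changed": false for this task.
  changed_when: false
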
25675 1727203990.17372: running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr27 25675 1727203990.17784: in run() - task 028d2410-947f-41bd-b19d-000000000137 25675 1727203990.17788: variable 'ansible_search_path' from source: unknown 25675 1727203990.17791: variable 'ansible_search_path' from source: unknown 25675 1727203990.17794: calling self._execute() 25675 1727203990.17857: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.17900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.17918: variable 'omit' from source: magic vars 25675 1727203990.18651: variable 'ansible_distribution_major_version' from source: facts 25675 1727203990.18874: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203990.19280: variable 'type' from source: set_fact 25675 1727203990.19284: variable 'state' from source: include params 25675 1727203990.19286: variable 'interface' from source: set_fact 25675 1727203990.19289: variable 'current_interfaces' from source: set_fact 25675 1727203990.19292: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 25675 1727203990.19294: when evaluation is False, skipping this task 25675 1727203990.19296: _execute() done 25675 1727203990.19299: dumping result to json 25675 1727203990.19303: done dumping result, returning 25675 1727203990.19305: done running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr27 [028d2410-947f-41bd-b19d-000000000137] 25675 1727203990.19307: sending task result for task 028d2410-947f-41bd-b19d-000000000137 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25675 1727203990.19430: no more pending results, returning what we have 25675 1727203990.19435: results queue empty 25675 1727203990.19436: checking for any_errors_fatal 25675 1727203990.19444: done checking for any_errors_fatal 25675 1727203990.19445: checking for max_fail_percentage 25675 1727203990.19447: done checking for max_fail_percentage 25675 1727203990.19448: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.19449: done checking to see if all hosts have failed 25675 1727203990.19450: getting the remaining hosts for this loop 25675 1727203990.19451: done getting the remaining hosts for this loop 25675 1727203990.19455: getting the next task for host managed-node2 25675 1727203990.19461: done getting next task for host managed-node2 25675 1727203990.19465: ^ task is: TASK: Create dummy interface {{ interface }} 25675 1727203990.19469: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.19473: getting variables 25675 1727203990.19477: in VariableManager get_vars() 25675 1727203990.19508: Calling all_inventory to load vars for managed-node2 25675 1727203990.19511: Calling groups_inventory to load vars for managed-node2 25675 1727203990.19514: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.19526: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.19528: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.19530: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.20100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.20335: done with get_vars() 25675 1727203990.20346: done getting variables 25675 1727203990.20580: done sending task result for task 028d2410-947f-41bd-b19d-000000000137 25675 1727203990.20584: WORKER PROCESS EXITING 25675 1727203990.20620: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727203990.20726: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.043) 0:00:09.658 ***** 25675 1727203990.20755: entering _queue_task() for managed-node2/command 25675 1727203990.21414: worker is 1 (out of 1 available) 25675 1727203990.21424: exiting _queue_task() for managed-node2/command 25675 1727203990.21434: done queuing things up, now waiting for results queue to drain 25675 1727203990.21435: waiting for pending results... 
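In contrast, the "Delete veth interface lsr27" task above never reaches the connection plugin: its `when:` expression evaluates to False (this run is creating the veth, not removing it), so the executor emits a skip result and echoes the failing expression back as `false_condition`. The dummy and tap tasks that follow in the trace are gated the same way, only with `type == 'dummy'` and `type == 'tap'`. A hedged sketch of the skipped task is below; the `ip link del` command is a placeholder, because a skipped task never reveals its command in the log.

# Placeholder sketch; only the task name and when: expression come from the trace.
- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link del {{ interface }}  # hypothetical command
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
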
25675 1727203990.21715: running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr27 25675 1727203990.21926: in run() - task 028d2410-947f-41bd-b19d-000000000138 25675 1727203990.21948: variable 'ansible_search_path' from source: unknown 25675 1727203990.21957: variable 'ansible_search_path' from source: unknown 25675 1727203990.22021: calling self._execute() 25675 1727203990.22189: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.22290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.22306: variable 'omit' from source: magic vars 25675 1727203990.23022: variable 'ansible_distribution_major_version' from source: facts 25675 1727203990.23042: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203990.23501: variable 'type' from source: set_fact 25675 1727203990.23512: variable 'state' from source: include params 25675 1727203990.23521: variable 'interface' from source: set_fact 25675 1727203990.23536: variable 'current_interfaces' from source: set_fact 25675 1727203990.23550: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 25675 1727203990.23586: when evaluation is False, skipping this task 25675 1727203990.23594: _execute() done 25675 1727203990.23602: dumping result to json 25675 1727203990.23611: done dumping result, returning 25675 1727203990.23651: done running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr27 [028d2410-947f-41bd-b19d-000000000138] 25675 1727203990.23663: sending task result for task 028d2410-947f-41bd-b19d-000000000138 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25675 1727203990.23920: no more pending results, returning what we have 25675 1727203990.23924: results queue empty 25675 1727203990.23925: checking for any_errors_fatal 25675 1727203990.23933: done checking for any_errors_fatal 25675 1727203990.23934: checking for max_fail_percentage 25675 1727203990.23935: done checking for max_fail_percentage 25675 1727203990.23936: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.23937: done checking to see if all hosts have failed 25675 1727203990.23938: getting the remaining hosts for this loop 25675 1727203990.23939: done getting the remaining hosts for this loop 25675 1727203990.23942: getting the next task for host managed-node2 25675 1727203990.23948: done getting next task for host managed-node2 25675 1727203990.23950: ^ task is: TASK: Delete dummy interface {{ interface }} 25675 1727203990.23953: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.23957: getting variables 25675 1727203990.23959: in VariableManager get_vars() 25675 1727203990.23989: Calling all_inventory to load vars for managed-node2 25675 1727203990.23992: Calling groups_inventory to load vars for managed-node2 25675 1727203990.23996: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.24009: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.24012: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.24015: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.24381: done sending task result for task 028d2410-947f-41bd-b19d-000000000138 25675 1727203990.24384: WORKER PROCESS EXITING 25675 1727203990.24594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.24993: done with get_vars() 25675 1727203990.25004: done getting variables 25675 1727203990.25059: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727203990.25170: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.046) 0:00:09.705 ***** 25675 1727203990.25403: entering _queue_task() for managed-node2/command 25675 1727203990.25869: worker is 1 (out of 1 available) 25675 1727203990.25883: exiting _queue_task() for managed-node2/command 25675 1727203990.25898: done queuing things up, now waiting for results queue to drain 25675 1727203990.25899: waiting for pending results... 
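The skip conditions above and below all reference `current_interfaces`, which the trace only identifies as coming "from source: set_fact". One plausible way such a fact is populated earlier in the play is sketched here; the actual mechanism is not visible in this excerpt, so treat the task as an assumption.

# Assumed sketch: current_interfaces is only consumed, never defined, in this excerpt.
# ansible_facts.interfaces is the standard gathered fact listing device names.
- name: Record the interface names present before the test
  ansible.builtin.set_fact:
    current_interfaces: "{{ ansible_facts.interfaces | default([]) }}"
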
25675 1727203990.26383: running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr27 25675 1727203990.26445: in run() - task 028d2410-947f-41bd-b19d-000000000139 25675 1727203990.26599: variable 'ansible_search_path' from source: unknown 25675 1727203990.26607: variable 'ansible_search_path' from source: unknown 25675 1727203990.26645: calling self._execute() 25675 1727203990.26826: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.26839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.26854: variable 'omit' from source: magic vars 25675 1727203990.27530: variable 'ansible_distribution_major_version' from source: facts 25675 1727203990.27687: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203990.28100: variable 'type' from source: set_fact 25675 1727203990.28111: variable 'state' from source: include params 25675 1727203990.28119: variable 'interface' from source: set_fact 25675 1727203990.28126: variable 'current_interfaces' from source: set_fact 25675 1727203990.28138: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 25675 1727203990.28145: when evaluation is False, skipping this task 25675 1727203990.28151: _execute() done 25675 1727203990.28158: dumping result to json 25675 1727203990.28165: done dumping result, returning 25675 1727203990.28179: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr27 [028d2410-947f-41bd-b19d-000000000139] 25675 1727203990.28214: sending task result for task 028d2410-947f-41bd-b19d-000000000139 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25675 1727203990.28352: no more pending results, returning what we have 25675 1727203990.28357: results queue empty 25675 1727203990.28358: checking for any_errors_fatal 25675 1727203990.28366: done checking for any_errors_fatal 25675 1727203990.28366: checking for max_fail_percentage 25675 1727203990.28368: done checking for max_fail_percentage 25675 1727203990.28369: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.28370: done checking to see if all hosts have failed 25675 1727203990.28370: getting the remaining hosts for this loop 25675 1727203990.28372: done getting the remaining hosts for this loop 25675 1727203990.28377: getting the next task for host managed-node2 25675 1727203990.28384: done getting next task for host managed-node2 25675 1727203990.28386: ^ task is: TASK: Create tap interface {{ interface }} 25675 1727203990.28390: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.28395: getting variables 25675 1727203990.28396: in VariableManager get_vars() 25675 1727203990.28425: Calling all_inventory to load vars for managed-node2 25675 1727203990.28427: Calling groups_inventory to load vars for managed-node2 25675 1727203990.28430: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.28443: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.28445: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.28447: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.29083: done sending task result for task 028d2410-947f-41bd-b19d-000000000139 25675 1727203990.29087: WORKER PROCESS EXITING 25675 1727203990.29116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.29508: done with get_vars() 25675 1727203990.29518: done getting variables 25675 1727203990.29572: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727203990.29878: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.045) 0:00:09.750 ***** 25675 1727203990.29909: entering _queue_task() for managed-node2/command 25675 1727203990.30448: worker is 1 (out of 1 available) 25675 1727203990.30460: exiting _queue_task() for managed-node2/command 25675 1727203990.30473: done queuing things up, now waiting for results queue to drain 25675 1727203990.30474: waiting for pending results... 
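The other variables driving these decisions are annotated in the trace as well: `type` and `interface` come "from source: set_fact", while `state` comes "from source: include params", i.e. it is passed in when the tasks file is included. Given that the veth/present branch is the one that actually ran and the interface is lsr27, the calling playbook most likely does something along these lines; the exact task names and file layout are assumptions.

# Assumed shape of the caller, inferred from the variable sources in the trace.
- name: Describe the test interface
  ansible.builtin.set_fact:
    type: veth
    interface: lsr27

- name: Manage the test interface
  ansible.builtin.include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present
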
25675 1727203990.31024: running TaskExecutor() for managed-node2/TASK: Create tap interface lsr27 25675 1727203990.31488: in run() - task 028d2410-947f-41bd-b19d-00000000013a 25675 1727203990.31687: variable 'ansible_search_path' from source: unknown 25675 1727203990.31692: variable 'ansible_search_path' from source: unknown 25675 1727203990.31696: calling self._execute() 25675 1727203990.31809: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.31981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.31984: variable 'omit' from source: magic vars 25675 1727203990.32743: variable 'ansible_distribution_major_version' from source: facts 25675 1727203990.32760: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203990.33161: variable 'type' from source: set_fact 25675 1727203990.33357: variable 'state' from source: include params 25675 1727203990.33360: variable 'interface' from source: set_fact 25675 1727203990.33363: variable 'current_interfaces' from source: set_fact 25675 1727203990.33366: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 25675 1727203990.33368: when evaluation is False, skipping this task 25675 1727203990.33373: _execute() done 25675 1727203990.33378: dumping result to json 25675 1727203990.33380: done dumping result, returning 25675 1727203990.33382: done running TaskExecutor() for managed-node2/TASK: Create tap interface lsr27 [028d2410-947f-41bd-b19d-00000000013a] 25675 1727203990.33384: sending task result for task 028d2410-947f-41bd-b19d-00000000013a skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25675 1727203990.33515: no more pending results, returning what we have 25675 1727203990.33519: results queue empty 25675 1727203990.33521: checking for any_errors_fatal 25675 1727203990.33528: done checking for any_errors_fatal 25675 1727203990.33529: checking for max_fail_percentage 25675 1727203990.33530: done checking for max_fail_percentage 25675 1727203990.33531: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.33532: done checking to see if all hosts have failed 25675 1727203990.33533: getting the remaining hosts for this loop 25675 1727203990.33534: done getting the remaining hosts for this loop 25675 1727203990.33538: getting the next task for host managed-node2 25675 1727203990.33545: done getting next task for host managed-node2 25675 1727203990.33547: ^ task is: TASK: Delete tap interface {{ interface }} 25675 1727203990.33551: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.33555: getting variables 25675 1727203990.33557: in VariableManager get_vars() 25675 1727203990.33801: Calling all_inventory to load vars for managed-node2 25675 1727203990.33804: Calling groups_inventory to load vars for managed-node2 25675 1727203990.33808: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.33821: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.33824: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.33827: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.34377: done sending task result for task 028d2410-947f-41bd-b19d-00000000013a 25675 1727203990.34381: WORKER PROCESS EXITING 25675 1727203990.34403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.34743: done with get_vars() 25675 1727203990.34753: done getting variables 25675 1727203990.35214: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727203990.35532: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.056) 0:00:09.806 ***** 25675 1727203990.35562: entering _queue_task() for managed-node2/command 25675 1727203990.36047: worker is 1 (out of 1 available) 25675 1727203990.36060: exiting _queue_task() for managed-node2/command 25675 1727203990.36074: done queuing things up, now waiting for results queue to drain 25675 1727203990.36278: waiting for pending results... 
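Once the tap checks below are skipped as well, the play moves on (still within this excerpt) to include assert_device_present.yml, which in turn includes get_interface_stat.yml and queues a stat task named "Get stat for interface lsr27". The excerpt ends just as that task starts, so the following is only a guess at its contents: checking for the device under /sys/class/net is a common way to assert presence, but the path and the register name here are assumptions.

# Assumed contents of get_interface_stat.yml; the trace confirms only that it is a
# stat task templated with the interface name, not what path or options it uses.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}  # assumed location
  register: interface_stat
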
25675 1727203990.36915: running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr27 25675 1727203990.36994: in run() - task 028d2410-947f-41bd-b19d-00000000013b 25675 1727203990.37007: variable 'ansible_search_path' from source: unknown 25675 1727203990.37011: variable 'ansible_search_path' from source: unknown 25675 1727203990.37046: calling self._execute() 25675 1727203990.37530: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.37537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.37547: variable 'omit' from source: magic vars 25675 1727203990.38685: variable 'ansible_distribution_major_version' from source: facts 25675 1727203990.38696: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203990.39287: variable 'type' from source: set_fact 25675 1727203990.39293: variable 'state' from source: include params 25675 1727203990.39296: variable 'interface' from source: set_fact 25675 1727203990.39343: variable 'current_interfaces' from source: set_fact 25675 1727203990.39347: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 25675 1727203990.39350: when evaluation is False, skipping this task 25675 1727203990.39352: _execute() done 25675 1727203990.39355: dumping result to json 25675 1727203990.39357: done dumping result, returning 25675 1727203990.39360: done running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr27 [028d2410-947f-41bd-b19d-00000000013b] 25675 1727203990.39362: sending task result for task 028d2410-947f-41bd-b19d-00000000013b 25675 1727203990.39425: done sending task result for task 028d2410-947f-41bd-b19d-00000000013b 25675 1727203990.39427: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25675 1727203990.39497: no more pending results, returning what we have 25675 1727203990.39502: results queue empty 25675 1727203990.39503: checking for any_errors_fatal 25675 1727203990.39511: done checking for any_errors_fatal 25675 1727203990.39511: checking for max_fail_percentage 25675 1727203990.39513: done checking for max_fail_percentage 25675 1727203990.39514: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.39515: done checking to see if all hosts have failed 25675 1727203990.39515: getting the remaining hosts for this loop 25675 1727203990.39517: done getting the remaining hosts for this loop 25675 1727203990.39521: getting the next task for host managed-node2 25675 1727203990.39529: done getting next task for host managed-node2 25675 1727203990.39533: ^ task is: TASK: Include the task 'assert_device_present.yml' 25675 1727203990.39536: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.39540: getting variables 25675 1727203990.39542: in VariableManager get_vars() 25675 1727203990.39574: Calling all_inventory to load vars for managed-node2 25675 1727203990.39579: Calling groups_inventory to load vars for managed-node2 25675 1727203990.39583: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.39596: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.39599: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.39602: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.40024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.40424: done with get_vars() 25675 1727203990.40433: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.051) 0:00:09.858 ***** 25675 1727203990.40730: entering _queue_task() for managed-node2/include_tasks 25675 1727203990.41198: worker is 1 (out of 1 available) 25675 1727203990.41212: exiting _queue_task() for managed-node2/include_tasks 25675 1727203990.41224: done queuing things up, now waiting for results queue to drain 25675 1727203990.41225: waiting for pending results... 25675 1727203990.41894: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' 25675 1727203990.41900: in run() - task 028d2410-947f-41bd-b19d-000000000012 25675 1727203990.41904: variable 'ansible_search_path' from source: unknown 25675 1727203990.41934: calling self._execute() 25675 1727203990.42407: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.42425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.42429: variable 'omit' from source: magic vars 25675 1727203990.43682: variable 'ansible_distribution_major_version' from source: facts 25675 1727203990.43701: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203990.43714: _execute() done 25675 1727203990.43723: dumping result to json 25675 1727203990.43732: done dumping result, returning 25675 1727203990.43743: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' [028d2410-947f-41bd-b19d-000000000012] 25675 1727203990.44182: sending task result for task 028d2410-947f-41bd-b19d-000000000012 25675 1727203990.44252: done sending task result for task 028d2410-947f-41bd-b19d-000000000012 25675 1727203990.44256: WORKER PROCESS EXITING 25675 1727203990.44288: no more pending results, returning what we have 25675 1727203990.44292: in VariableManager get_vars() 25675 1727203990.44327: Calling all_inventory to load vars for managed-node2 25675 1727203990.44330: Calling groups_inventory to load vars for managed-node2 25675 1727203990.44333: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.44345: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.44348: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.44350: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.44839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.45436: done with get_vars() 25675 
1727203990.45444: variable 'ansible_search_path' from source: unknown 25675 1727203990.45457: we have included files to process 25675 1727203990.45458: generating all_blocks data 25675 1727203990.45459: done generating all_blocks data 25675 1727203990.45465: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25675 1727203990.45466: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25675 1727203990.45468: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25675 1727203990.46028: in VariableManager get_vars() 25675 1727203990.46045: done with get_vars() 25675 1727203990.46581: done processing included file 25675 1727203990.46583: iterating over new_blocks loaded from include file 25675 1727203990.46585: in VariableManager get_vars() 25675 1727203990.46597: done with get_vars() 25675 1727203990.46599: filtering new block on tags 25675 1727203990.46617: done filtering new block on tags 25675 1727203990.46620: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 25675 1727203990.46624: extending task lists for all hosts with included blocks 25675 1727203990.48224: done extending task lists 25675 1727203990.48225: done processing included files 25675 1727203990.48226: results queue empty 25675 1727203990.48227: checking for any_errors_fatal 25675 1727203990.48230: done checking for any_errors_fatal 25675 1727203990.48231: checking for max_fail_percentage 25675 1727203990.48232: done checking for max_fail_percentage 25675 1727203990.48233: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.48234: done checking to see if all hosts have failed 25675 1727203990.48235: getting the remaining hosts for this loop 25675 1727203990.48236: done getting the remaining hosts for this loop 25675 1727203990.48238: getting the next task for host managed-node2 25675 1727203990.48242: done getting next task for host managed-node2 25675 1727203990.48244: ^ task is: TASK: Include the task 'get_interface_stat.yml' 25675 1727203990.48246: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.48249: getting variables 25675 1727203990.48249: in VariableManager get_vars() 25675 1727203990.48259: Calling all_inventory to load vars for managed-node2 25675 1727203990.48261: Calling groups_inventory to load vars for managed-node2 25675 1727203990.48264: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.48270: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.48272: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.48277: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.48630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.49010: done with get_vars() 25675 1727203990.49020: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.083) 0:00:09.942 ***** 25675 1727203990.49094: entering _queue_task() for managed-node2/include_tasks 25675 1727203990.49794: worker is 1 (out of 1 available) 25675 1727203990.49804: exiting _queue_task() for managed-node2/include_tasks 25675 1727203990.49815: done queuing things up, now waiting for results queue to drain 25675 1727203990.49816: waiting for pending results... 25675 1727203990.50140: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 25675 1727203990.50336: in run() - task 028d2410-947f-41bd-b19d-0000000001d3 25675 1727203990.50359: variable 'ansible_search_path' from source: unknown 25675 1727203990.50422: variable 'ansible_search_path' from source: unknown 25675 1727203990.50461: calling self._execute() 25675 1727203990.50783: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.50786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.50788: variable 'omit' from source: magic vars 25675 1727203990.51349: variable 'ansible_distribution_major_version' from source: facts 25675 1727203990.51414: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203990.51491: _execute() done 25675 1727203990.51500: dumping result to json 25675 1727203990.51513: done dumping result, returning 25675 1727203990.51523: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-41bd-b19d-0000000001d3] 25675 1727203990.51533: sending task result for task 028d2410-947f-41bd-b19d-0000000001d3 25675 1727203990.51899: done sending task result for task 028d2410-947f-41bd-b19d-0000000001d3 25675 1727203990.51903: WORKER PROCESS EXITING 25675 1727203990.51931: no more pending results, returning what we have 25675 1727203990.51936: in VariableManager get_vars() 25675 1727203990.51968: Calling all_inventory to load vars for managed-node2 25675 1727203990.51971: Calling groups_inventory to load vars for managed-node2 25675 1727203990.51974: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.51991: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.51993: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.51996: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.52195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 25675 1727203990.52823: done with get_vars() 25675 1727203990.52831: variable 'ansible_search_path' from source: unknown 25675 1727203990.52832: variable 'ansible_search_path' from source: unknown 25675 1727203990.52866: we have included files to process 25675 1727203990.52867: generating all_blocks data 25675 1727203990.52868: done generating all_blocks data 25675 1727203990.52870: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25675 1727203990.52871: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25675 1727203990.52873: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25675 1727203990.53494: done processing included file 25675 1727203990.53496: iterating over new_blocks loaded from include file 25675 1727203990.53498: in VariableManager get_vars() 25675 1727203990.53511: done with get_vars() 25675 1727203990.53513: filtering new block on tags 25675 1727203990.53527: done filtering new block on tags 25675 1727203990.53529: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 25675 1727203990.53533: extending task lists for all hosts with included blocks 25675 1727203990.53626: done extending task lists 25675 1727203990.53627: done processing included files 25675 1727203990.53628: results queue empty 25675 1727203990.53628: checking for any_errors_fatal 25675 1727203990.53631: done checking for any_errors_fatal 25675 1727203990.53632: checking for max_fail_percentage 25675 1727203990.53633: done checking for max_fail_percentage 25675 1727203990.53634: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.53634: done checking to see if all hosts have failed 25675 1727203990.53635: getting the remaining hosts for this loop 25675 1727203990.53636: done getting the remaining hosts for this loop 25675 1727203990.53639: getting the next task for host managed-node2 25675 1727203990.53642: done getting next task for host managed-node2 25675 1727203990.53644: ^ task is: TASK: Get stat for interface {{ interface }} 25675 1727203990.53647: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.53649: getting variables 25675 1727203990.53650: in VariableManager get_vars() 25675 1727203990.53658: Calling all_inventory to load vars for managed-node2 25675 1727203990.53660: Calling groups_inventory to load vars for managed-node2 25675 1727203990.53662: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.53667: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.53669: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.53672: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.54013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203990.54400: done with get_vars() 25675 1727203990.54409: done getting variables 25675 1727203990.54758: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.056) 0:00:09.999 ***** 25675 1727203990.54789: entering _queue_task() for managed-node2/stat 25675 1727203990.55267: worker is 1 (out of 1 available) 25675 1727203990.55680: exiting _queue_task() for managed-node2/stat 25675 1727203990.55690: done queuing things up, now waiting for results queue to drain 25675 1727203990.55692: waiting for pending results... 25675 1727203990.55927: running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 25675 1727203990.56004: in run() - task 028d2410-947f-41bd-b19d-00000000021e 25675 1727203990.56028: variable 'ansible_search_path' from source: unknown 25675 1727203990.56383: variable 'ansible_search_path' from source: unknown 25675 1727203990.56387: calling self._execute() 25675 1727203990.56491: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.56494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.56497: variable 'omit' from source: magic vars 25675 1727203990.57108: variable 'ansible_distribution_major_version' from source: facts 25675 1727203990.57267: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203990.57281: variable 'omit' from source: magic vars 25675 1727203990.57327: variable 'omit' from source: magic vars 25675 1727203990.57538: variable 'interface' from source: set_fact 25675 1727203990.57560: variable 'omit' from source: magic vars 25675 1727203990.57623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203990.57719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203990.57821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203990.57845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203990.57861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203990.57935: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203990.57944: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.58021: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 25675 1727203990.58121: Set connection var ansible_shell_type to sh 25675 1727203990.58346: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203990.58349: Set connection var ansible_timeout to 10 25675 1727203990.58352: Set connection var ansible_pipelining to False 25675 1727203990.58354: Set connection var ansible_shell_executable to /bin/sh 25675 1727203990.58357: Set connection var ansible_connection to ssh 25675 1727203990.58359: variable 'ansible_shell_executable' from source: unknown 25675 1727203990.58361: variable 'ansible_connection' from source: unknown 25675 1727203990.58363: variable 'ansible_module_compression' from source: unknown 25675 1727203990.58365: variable 'ansible_shell_type' from source: unknown 25675 1727203990.58367: variable 'ansible_shell_executable' from source: unknown 25675 1727203990.58369: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203990.58371: variable 'ansible_pipelining' from source: unknown 25675 1727203990.58373: variable 'ansible_timeout' from source: unknown 25675 1727203990.58377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203990.58881: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727203990.58886: variable 'omit' from source: magic vars 25675 1727203990.58889: starting attempt loop 25675 1727203990.58891: running the handler 25675 1727203990.58893: _low_level_execute_command(): starting 25675 1727203990.58895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203990.60348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203990.60538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203990.60694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203990.60780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203990.62479: stdout chunk (state=3): >>>/root <<< 25675 1727203990.62695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203990.62699: stdout chunk (state=3): >>><<< 25675 1727203990.62701: stderr chunk (state=3): >>><<< 25675 1727203990.62934: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203990.62938: _low_level_execute_command(): starting 25675 1727203990.62941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269 `" && echo ansible-tmp-1727203990.628419-26566-211520190179269="` echo /root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269 `" ) && sleep 0' 25675 1727203990.64095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203990.64202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203990.64219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203990.64238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203990.64391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203990.66266: stdout chunk (state=3): >>>ansible-tmp-1727203990.628419-26566-211520190179269=/root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269 <<< 25675 1727203990.66443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203990.66453: stdout chunk (state=3): >>><<< 25675 1727203990.66464: stderr chunk (state=3): >>><<< 25675 1727203990.66492: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203990.628419-26566-211520190179269=/root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203990.66681: variable 'ansible_module_compression' from source: unknown 25675 1727203990.66686: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25675 1727203990.66882: variable 'ansible_facts' from source: unknown 25675 1727203990.67082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/AnsiballZ_stat.py 25675 1727203990.67234: Sending initial data 25675 1727203990.67245: Sent initial data (152 bytes) 25675 1727203990.68784: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203990.68799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203990.68815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203990.68836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203990.68939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203990.70544: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203990.70604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727203990.70755: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmps66tmfbp /root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/AnsiballZ_stat.py <<< 25675 1727203990.70764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/AnsiballZ_stat.py" <<< 25675 1727203990.70851: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmps66tmfbp" to remote "/root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/AnsiballZ_stat.py" <<< 25675 1727203990.72381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203990.72582: stdout chunk (state=3): >>><<< 25675 1727203990.72586: stderr chunk (state=3): >>><<< 25675 1727203990.72588: done transferring module to remote 25675 1727203990.72590: _low_level_execute_command(): starting 25675 1727203990.72592: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/ /root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/AnsiballZ_stat.py && sleep 0' 25675 1727203990.73898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203990.73981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203990.74145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203990.74206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203990.76157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203990.76167: stdout chunk (state=3): >>><<< 25675 1727203990.76185: stderr chunk (state=3): >>><<< 25675 1727203990.76482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203990.76491: _low_level_execute_command(): starting 25675 1727203990.76494: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/AnsiballZ_stat.py && sleep 0' 25675 1727203990.77450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203990.77587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203990.77667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203990.77685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203990.77877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203990.93100: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28225, "dev": 23, "nlink": 1, "atime": 1727203988.8670998, "mtime": 1727203988.8670998, "ctime": 1727203988.8670998, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": 
"../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 25675 1727203990.94671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727203990.94678: stdout chunk (state=3): >>><<< 25675 1727203990.94689: stderr chunk (state=3): >>><<< 25675 1727203990.94708: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28225, "dev": 23, "nlink": 1, "atime": 1727203988.8670998, "mtime": 1727203988.8670998, "ctime": 1727203988.8670998, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727203990.94760: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203990.94769: _low_level_execute_command(): starting 25675 1727203990.94778: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203990.628419-26566-211520190179269/ > /dev/null 2>&1 && sleep 0' 25675 1727203990.95948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203990.96282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203990.96299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203990.96313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203990.96420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203990.98283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203990.98325: stderr chunk (state=3): >>><<< 25675 1727203990.98390: stdout chunk (state=3): >>><<< 25675 1727203990.98413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203990.98422: handler run complete 25675 1727203990.98468: attempt loop complete, returning result 25675 1727203990.98471: _execute() done 25675 1727203990.98479: dumping result to json 25675 1727203990.98624: done dumping result, returning 25675 1727203990.98634: done running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 [028d2410-947f-41bd-b19d-00000000021e] 25675 1727203990.98669: sending task result for task 028d2410-947f-41bd-b19d-00000000021e 25675 1727203990.98990: done sending task result for task 028d2410-947f-41bd-b19d-00000000021e 25675 1727203990.98993: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727203988.8670998, "block_size": 4096, "blocks": 0, "ctime": 1727203988.8670998, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28225, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1727203988.8670998, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 25675 1727203990.99124: no more pending results, returning what we have 25675 1727203990.99127: results queue empty 25675 1727203990.99128: checking for any_errors_fatal 25675 1727203990.99379: done checking for any_errors_fatal 25675 1727203990.99380: checking for max_fail_percentage 25675 1727203990.99382: done checking for max_fail_percentage 25675 1727203990.99383: checking to see if all hosts have failed and the running result is not ok 25675 1727203990.99384: done checking to see if all hosts have failed 25675 1727203990.99385: getting the remaining hosts for this loop 25675 1727203990.99386: done getting the remaining hosts for this loop 25675 1727203990.99390: getting the next task for host managed-node2 25675 1727203990.99397: done getting next task for host managed-node2 25675 1727203990.99399: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 25675 1727203990.99402: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203990.99406: getting variables 25675 1727203990.99407: in VariableManager get_vars() 25675 1727203990.99499: Calling all_inventory to load vars for managed-node2 25675 1727203990.99502: Calling groups_inventory to load vars for managed-node2 25675 1727203990.99505: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203990.99514: Calling all_plugins_play to load vars for managed-node2 25675 1727203990.99517: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203990.99520: Calling groups_plugins_play to load vars for managed-node2 25675 1727203990.99797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203991.00192: done with get_vars() 25675 1727203991.00201: done getting variables 25675 1727203991.00415: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 25675 1727203991.00668: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:53:11 -0400 (0:00:00.460) 0:00:10.459 ***** 25675 1727203991.00807: entering _queue_task() for managed-node2/assert 25675 1727203991.00809: Creating lock for assert 25675 1727203991.01891: worker is 1 (out of 1 available) 25675 1727203991.01903: exiting _queue_task() for managed-node2/assert 25675 1727203991.01912: done queuing things up, now waiting for results queue to drain 25675 1727203991.01913: waiting for pending results... 
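Taken together with the include logged earlier at assert_device_present.yml:3, the two tasks in that file are presumably close to the sketch below. This is reconstructed from the task names, the task paths, and the interface_stat.stat.exists condition evaluated just after this point; it is not copied from the file itself:

    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml

    - name: "Assert that the interface is present - '{{ interface }}'"
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists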
25675 1727203991.02401: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr27' 25675 1727203991.02406: in run() - task 028d2410-947f-41bd-b19d-0000000001d4 25675 1727203991.02410: variable 'ansible_search_path' from source: unknown 25675 1727203991.02413: variable 'ansible_search_path' from source: unknown 25675 1727203991.02415: calling self._execute() 25675 1727203991.02824: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203991.02827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203991.02830: variable 'omit' from source: magic vars 25675 1727203991.03445: variable 'ansible_distribution_major_version' from source: facts 25675 1727203991.03462: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203991.03681: variable 'omit' from source: magic vars 25675 1727203991.03686: variable 'omit' from source: magic vars 25675 1727203991.03881: variable 'interface' from source: set_fact 25675 1727203991.03884: variable 'omit' from source: magic vars 25675 1727203991.04080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203991.04084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203991.04093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203991.04115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203991.04139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203991.04177: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203991.04246: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203991.04255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203991.04482: Set connection var ansible_shell_type to sh 25675 1727203991.04495: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203991.04506: Set connection var ansible_timeout to 10 25675 1727203991.04577: Set connection var ansible_pipelining to False 25675 1727203991.04588: Set connection var ansible_shell_executable to /bin/sh 25675 1727203991.04596: Set connection var ansible_connection to ssh 25675 1727203991.04626: variable 'ansible_shell_executable' from source: unknown 25675 1727203991.04682: variable 'ansible_connection' from source: unknown 25675 1727203991.04689: variable 'ansible_module_compression' from source: unknown 25675 1727203991.04695: variable 'ansible_shell_type' from source: unknown 25675 1727203991.04702: variable 'ansible_shell_executable' from source: unknown 25675 1727203991.04708: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203991.04714: variable 'ansible_pipelining' from source: unknown 25675 1727203991.04720: variable 'ansible_timeout' from source: unknown 25675 1727203991.04726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203991.05000: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 25675 1727203991.05180: variable 'omit' from source: magic vars 25675 1727203991.05183: starting attempt loop 25675 1727203991.05185: running the handler 25675 1727203991.05345: variable 'interface_stat' from source: set_fact 25675 1727203991.05544: Evaluated conditional (interface_stat.stat.exists): True 25675 1727203991.05547: handler run complete 25675 1727203991.05549: attempt loop complete, returning result 25675 1727203991.05551: _execute() done 25675 1727203991.05553: dumping result to json 25675 1727203991.05554: done dumping result, returning 25675 1727203991.05556: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr27' [028d2410-947f-41bd-b19d-0000000001d4] 25675 1727203991.05558: sending task result for task 028d2410-947f-41bd-b19d-0000000001d4 25675 1727203991.05626: done sending task result for task 028d2410-947f-41bd-b19d-0000000001d4 25675 1727203991.05629: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 25675 1727203991.05697: no more pending results, returning what we have 25675 1727203991.05701: results queue empty 25675 1727203991.05702: checking for any_errors_fatal 25675 1727203991.05710: done checking for any_errors_fatal 25675 1727203991.05711: checking for max_fail_percentage 25675 1727203991.05713: done checking for max_fail_percentage 25675 1727203991.05713: checking to see if all hosts have failed and the running result is not ok 25675 1727203991.05714: done checking to see if all hosts have failed 25675 1727203991.05715: getting the remaining hosts for this loop 25675 1727203991.05717: done getting the remaining hosts for this loop 25675 1727203991.05720: getting the next task for host managed-node2 25675 1727203991.05729: done getting next task for host managed-node2 25675 1727203991.05731: ^ task is: TASK: meta (flush_handlers) 25675 1727203991.05733: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203991.05738: getting variables 25675 1727203991.05739: in VariableManager get_vars() 25675 1727203991.05770: Calling all_inventory to load vars for managed-node2 25675 1727203991.05777: Calling groups_inventory to load vars for managed-node2 25675 1727203991.05781: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203991.05793: Calling all_plugins_play to load vars for managed-node2 25675 1727203991.05796: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203991.05800: Calling groups_plugins_play to load vars for managed-node2 25675 1727203991.06399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203991.06913: done with get_vars() 25675 1727203991.06924: done getting variables 25675 1727203991.07108: in VariableManager get_vars() 25675 1727203991.07118: Calling all_inventory to load vars for managed-node2 25675 1727203991.07121: Calling groups_inventory to load vars for managed-node2 25675 1727203991.07123: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203991.07127: Calling all_plugins_play to load vars for managed-node2 25675 1727203991.07130: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203991.07132: Calling groups_plugins_play to load vars for managed-node2 25675 1727203991.07494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203991.07862: done with get_vars() 25675 1727203991.07881: done queuing things up, now waiting for results queue to drain 25675 1727203991.07883: results queue empty 25675 1727203991.07884: checking for any_errors_fatal 25675 1727203991.07886: done checking for any_errors_fatal 25675 1727203991.07887: checking for max_fail_percentage 25675 1727203991.07888: done checking for max_fail_percentage 25675 1727203991.07889: checking to see if all hosts have failed and the running result is not ok 25675 1727203991.07890: done checking to see if all hosts have failed 25675 1727203991.07896: getting the remaining hosts for this loop 25675 1727203991.07897: done getting the remaining hosts for this loop 25675 1727203991.07899: getting the next task for host managed-node2 25675 1727203991.07903: done getting next task for host managed-node2 25675 1727203991.07904: ^ task is: TASK: meta (flush_handlers) 25675 1727203991.07906: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203991.07908: getting variables 25675 1727203991.07909: in VariableManager get_vars() 25675 1727203991.07917: Calling all_inventory to load vars for managed-node2 25675 1727203991.07919: Calling groups_inventory to load vars for managed-node2 25675 1727203991.07921: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203991.07926: Calling all_plugins_play to load vars for managed-node2 25675 1727203991.07928: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203991.07931: Calling groups_plugins_play to load vars for managed-node2 25675 1727203991.08304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203991.08734: done with get_vars() 25675 1727203991.08742: done getting variables 25675 1727203991.08792: in VariableManager get_vars() 25675 1727203991.08800: Calling all_inventory to load vars for managed-node2 25675 1727203991.08802: Calling groups_inventory to load vars for managed-node2 25675 1727203991.08804: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203991.08809: Calling all_plugins_play to load vars for managed-node2 25675 1727203991.08811: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203991.08813: Calling groups_plugins_play to load vars for managed-node2 25675 1727203991.09125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203991.09533: done with get_vars() 25675 1727203991.09546: done queuing things up, now waiting for results queue to drain 25675 1727203991.09548: results queue empty 25675 1727203991.09549: checking for any_errors_fatal 25675 1727203991.09550: done checking for any_errors_fatal 25675 1727203991.09551: checking for max_fail_percentage 25675 1727203991.09552: done checking for max_fail_percentage 25675 1727203991.09553: checking to see if all hosts have failed and the running result is not ok 25675 1727203991.09554: done checking to see if all hosts have failed 25675 1727203991.09554: getting the remaining hosts for this loop 25675 1727203991.09555: done getting the remaining hosts for this loop 25675 1727203991.09558: getting the next task for host managed-node2 25675 1727203991.09561: done getting next task for host managed-node2 25675 1727203991.09561: ^ task is: None 25675 1727203991.09563: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203991.09564: done queuing things up, now waiting for results queue to drain 25675 1727203991.09565: results queue empty 25675 1727203991.09565: checking for any_errors_fatal 25675 1727203991.09566: done checking for any_errors_fatal 25675 1727203991.09567: checking for max_fail_percentage 25675 1727203991.09568: done checking for max_fail_percentage 25675 1727203991.09568: checking to see if all hosts have failed and the running result is not ok 25675 1727203991.09569: done checking to see if all hosts have failed 25675 1727203991.09687: getting the next task for host managed-node2 25675 1727203991.09691: done getting next task for host managed-node2 25675 1727203991.09692: ^ task is: None 25675 1727203991.09694: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203991.09812: in VariableManager get_vars() 25675 1727203991.09834: done with get_vars() 25675 1727203991.09841: in VariableManager get_vars() 25675 1727203991.09853: done with get_vars() 25675 1727203991.09858: variable 'omit' from source: magic vars 25675 1727203991.10030: in VariableManager get_vars() 25675 1727203991.10046: done with get_vars() 25675 1727203991.10069: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 25675 1727203991.11531: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727203991.11652: getting the remaining hosts for this loop 25675 1727203991.11654: done getting the remaining hosts for this loop 25675 1727203991.11656: getting the next task for host managed-node2 25675 1727203991.11659: done getting next task for host managed-node2 25675 1727203991.11661: ^ task is: TASK: Gathering Facts 25675 1727203991.11663: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203991.11665: getting variables 25675 1727203991.11666: in VariableManager get_vars() 25675 1727203991.11723: Calling all_inventory to load vars for managed-node2 25675 1727203991.11726: Calling groups_inventory to load vars for managed-node2 25675 1727203991.11728: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203991.11733: Calling all_plugins_play to load vars for managed-node2 25675 1727203991.11736: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203991.11738: Calling groups_plugins_play to load vars for managed-node2 25675 1727203991.12108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203991.12530: done with get_vars() 25675 1727203991.12538: done getting variables 25675 1727203991.12587: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Tuesday 24 September 2024 14:53:11 -0400 (0:00:00.118) 0:00:10.577 ***** 25675 1727203991.12611: entering _queue_task() for managed-node2/gather_facts 25675 1727203991.13243: worker is 1 (out of 1 available) 25675 1727203991.13254: exiting _queue_task() for managed-node2/gather_facts 25675 1727203991.13267: done queuing things up, now waiting for results queue to drain 25675 1727203991.13268: waiting for pending results... 
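Before executing it, the log applies the same guard it has applied to every task so far, reported as "Evaluated conditional (ansible_distribution_major_version != '6')". In playbook terms that corresponds to a when clause like the one below; the surrounding task is a hypothetical placeholder, only the condition itself comes from the log:

    - name: Example task gated on the distribution check   # hypothetical task, not from the playbook
      ansible.builtin.debug:
        msg: Skipped on EL6-era hosts
      when: ansible_distribution_major_version != '6'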
25675 1727203991.13811: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727203991.13914: in run() - task 028d2410-947f-41bd-b19d-000000000237 25675 1727203991.13937: variable 'ansible_search_path' from source: unknown 25675 1727203991.13974: calling self._execute() 25675 1727203991.14172: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203991.14186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203991.14200: variable 'omit' from source: magic vars 25675 1727203991.14940: variable 'ansible_distribution_major_version' from source: facts 25675 1727203991.14955: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203991.15012: variable 'omit' from source: magic vars 25675 1727203991.15044: variable 'omit' from source: magic vars 25675 1727203991.15152: variable 'omit' from source: magic vars 25675 1727203991.15197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203991.15284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203991.15351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203991.15577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203991.15581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203991.15583: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203991.15585: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203991.15587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203991.15721: Set connection var ansible_shell_type to sh 25675 1727203991.15772: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203991.15788: Set connection var ansible_timeout to 10 25675 1727203991.15798: Set connection var ansible_pipelining to False 25675 1727203991.15807: Set connection var ansible_shell_executable to /bin/sh 25675 1727203991.16007: Set connection var ansible_connection to ssh 25675 1727203991.16010: variable 'ansible_shell_executable' from source: unknown 25675 1727203991.16012: variable 'ansible_connection' from source: unknown 25675 1727203991.16015: variable 'ansible_module_compression' from source: unknown 25675 1727203991.16017: variable 'ansible_shell_type' from source: unknown 25675 1727203991.16019: variable 'ansible_shell_executable' from source: unknown 25675 1727203991.16021: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203991.16023: variable 'ansible_pipelining' from source: unknown 25675 1727203991.16025: variable 'ansible_timeout' from source: unknown 25675 1727203991.16027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203991.16298: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203991.16484: variable 'omit' from source: magic vars 25675 1727203991.16487: starting attempt loop 25675 1727203991.16490: running the 
handler 25675 1727203991.16492: variable 'ansible_facts' from source: unknown 25675 1727203991.16494: _low_level_execute_command(): starting 25675 1727203991.16502: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203991.18031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203991.18047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203991.18124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203991.18223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203991.19910: stdout chunk (state=3): >>>/root <<< 25675 1727203991.20041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203991.20131: stderr chunk (state=3): >>><<< 25675 1727203991.20141: stdout chunk (state=3): >>><<< 25675 1727203991.20202: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203991.20372: _low_level_execute_command(): starting 25675 1727203991.20378: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526 `" && echo ansible-tmp-1727203991.202825-26590-150936874959526="` echo /root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526 `" ) && sleep 0' 
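For the fact-gathering task now running, the connection settings for managed-node2 were resolved just above (ssh connection, sh shell type, /bin/sh executable, 10 second timeout, pipelining disabled, ZIP_DEFLATED module compression), with most values reported as coming from defaults ("source: unknown"). Pinning the same behaviour explicitly would look roughly like the following host or group variables, shown here only as an illustration:

    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false
    ansible_module_compression: ZIP_DEFLATED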
25675 1727203991.21540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203991.21553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203991.21573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203991.21693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203991.21792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203991.21862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203991.23764: stdout chunk (state=3): >>>ansible-tmp-1727203991.202825-26590-150936874959526=/root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526 <<< 25675 1727203991.23862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203991.23968: stderr chunk (state=3): >>><<< 25675 1727203991.23971: stdout chunk (state=3): >>><<< 25675 1727203991.24007: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203991.202825-26590-150936874959526=/root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203991.24041: variable 'ansible_module_compression' from source: unknown 25675 1727203991.24485: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727203991.24488: variable 'ansible_facts' from source: unknown 25675 1727203991.24720: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/AnsiballZ_setup.py 25675 1727203991.25199: Sending initial data 25675 1727203991.25207: Sent initial data (153 bytes) 25675 1727203991.25919: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203991.25935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203991.25958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203991.25992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203991.26007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203991.26020: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727203991.26062: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203991.26116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203991.26141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203991.26178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203991.26274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203991.27880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203991.27950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203991.28147: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmped0pfwf0 /root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/AnsiballZ_setup.py <<< 25675 1727203991.28151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/AnsiballZ_setup.py" <<< 25675 1727203991.28191: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmped0pfwf0" to remote "/root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/AnsiballZ_setup.py" <<< 25675 1727203991.30322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203991.30339: stdout chunk (state=3): >>><<< 25675 1727203991.30349: stderr chunk (state=3): >>><<< 25675 1727203991.30382: done transferring module to remote 25675 1727203991.30396: _low_level_execute_command(): starting 25675 1727203991.30404: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/ /root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/AnsiballZ_setup.py && sleep 0' 25675 1727203991.31050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203991.31065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203991.31167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203991.31200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203991.31218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203991.31240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203991.31339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203991.33089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203991.33116: stderr chunk (state=3): >>><<< 25675 1727203991.33127: stdout chunk (state=3): >>><<< 25675 1727203991.33140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203991.33142: _low_level_execute_command(): starting 25675 1727203991.33148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/AnsiballZ_setup.py && sleep 0' 25675 1727203991.33615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203991.33684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203991.33708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203991.33826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203992.00060: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "a<<< 25675 1727203992.00068: stdout chunk (state=3): >>>nsible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2912, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 619, "free": 2912}, "nocache": {"free": 3269, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 577, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785718784, "block_size": 4096, "block_total": 65519099, "block_available": 63912529, "block_used": 1606570, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_interfaces": ["lsr27", "lo", "peerlsr27", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gat<<< 25675 1727203992.00089: stdout chunk (state=3): >>>her_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "f2:f7:d0:d6:86:b9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f0f7:d0ff:fed6:86b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "12:fe:e3:2a:a8:0d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::10fe:e3ff:fe2a:a80d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segme<<< 25675 1727203992.00113: stdout chunk (state=3): >>>ntation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::f0f7:d0ff:fed6:86b9", "fe80::10fe:e3ff:fe2a:a80d", "fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d", "fe80::10fe:e3ff:fe2a:a80d", "fe80::f0f7:d0ff:fed6:86b9"]}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "11", "epoch": "1727203991", "epoch_int": "1727203991", "date": "2024-09-24", "time": "14:53:11", "iso8601_micro": "2024-09-24T18:53:11.997547Z", "iso8601": "2024-09-24T18:53:11Z", "iso8601_basic": "20240924T145311997547", "iso8601_basic_short": "20240924T145311", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.5009765625, "5m": 0.4267578125, "15m": 0.224609375}, "ansible_<<< 25675 1727203992.00117: stdout chunk (state=3): >>>python": {"version": {"major": 3, "minor": 12, "micro": 5, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727203992.02086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727203992.02108: stderr chunk (state=3): >>><<< 25675 1727203992.02111: stdout chunk (state=3): >>><<< 25675 1727203992.02232: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 
1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2912, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 619, "free": 2912}, "nocache": {"free": 3269, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 577, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785718784, "block_size": 4096, "block_total": 65519099, "block_available": 63912529, "block_used": 1606570, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_interfaces": ["lsr27", "lo", "peerlsr27", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "f2:f7:d0:d6:86:b9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f0f7:d0ff:fed6:86b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "12:fe:e3:2a:a8:0d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::10fe:e3ff:fe2a:a80d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": 
"on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::f0f7:d0ff:fed6:86b9", "fe80::10fe:e3ff:fe2a:a80d", "fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d", "fe80::10fe:e3ff:fe2a:a80d", "fe80::f0f7:d0ff:fed6:86b9"]}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "11", "epoch": "1727203991", "epoch_int": "1727203991", "date": "2024-09-24", "time": "14:53:11", "iso8601_micro": "2024-09-24T18:53:11.997547Z", "iso8601": "2024-09-24T18:53:11Z", "iso8601_basic": "20240924T145311997547", "iso8601_basic_short": "20240924T145311", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.5009765625, "5m": 0.4267578125, "15m": 0.224609375}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727203992.02719: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203992.02734: _low_level_execute_command(): starting 25675 1727203992.02739: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203991.202825-26590-150936874959526/ > /dev/null 2>&1 && sleep 0' 25675 1727203992.03173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203992.03178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203992.03181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727203992.03183: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203992.03185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203992.03251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203992.03354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203992.05196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203992.05200: stderr chunk (state=3): >>><<< 25675 1727203992.05202: stdout chunk (state=3): >>><<< 25675 1727203992.05216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203992.05222: handler run complete 25675 1727203992.05314: variable 'ansible_facts' from source: unknown 25675 1727203992.05380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.05607: variable 'ansible_facts' from source: unknown 25675 1727203992.05668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.05760: attempt loop complete, returning result 25675 1727203992.05764: _execute() done 25675 1727203992.05766: dumping result to json 25675 1727203992.05794: done dumping result, returning 25675 1727203992.05801: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-000000000237] 25675 1727203992.05805: sending task result for task 028d2410-947f-41bd-b19d-000000000237 25675 1727203992.06336: done sending task result for task 028d2410-947f-41bd-b19d-000000000237 25675 1727203992.06339: WORKER PROCESS EXITING ok: [managed-node2] 25675 1727203992.06526: no more pending results, returning what we have 25675 1727203992.06528: results queue empty 25675 1727203992.06529: checking for any_errors_fatal 25675 1727203992.06530: done checking for any_errors_fatal 25675 1727203992.06531: checking for max_fail_percentage 25675 1727203992.06532: done checking for max_fail_percentage 25675 1727203992.06533: checking to see if all hosts have failed and the running result is not ok 25675 1727203992.06534: done checking to see if all hosts have failed 25675 1727203992.06535: getting the remaining hosts for this loop 25675 1727203992.06536: done getting the remaining hosts for this loop 25675 1727203992.06542: getting the next task for host managed-node2 25675 1727203992.06546: done getting next task for host managed-node2 25675 1727203992.06546: ^ task is: TASK: meta (flush_handlers) 25675 1727203992.06548: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203992.06550: getting variables 25675 1727203992.06551: in VariableManager get_vars() 25675 1727203992.06569: Calling all_inventory to load vars for managed-node2 25675 1727203992.06572: Calling groups_inventory to load vars for managed-node2 25675 1727203992.06574: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203992.06583: Calling all_plugins_play to load vars for managed-node2 25675 1727203992.06585: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203992.06586: Calling groups_plugins_play to load vars for managed-node2 25675 1727203992.06732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.06958: done with get_vars() 25675 1727203992.06967: done getting variables 25675 1727203992.07043: in VariableManager get_vars() 25675 1727203992.07055: Calling all_inventory to load vars for managed-node2 25675 1727203992.07057: Calling groups_inventory to load vars for managed-node2 25675 1727203992.07062: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203992.07068: Calling all_plugins_play to load vars for managed-node2 25675 1727203992.07072: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203992.07077: Calling groups_plugins_play to load vars for managed-node2 25675 1727203992.07231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.07616: done with get_vars() 25675 1727203992.07628: done queuing things up, now waiting for results queue to drain 25675 1727203992.07630: results queue empty 25675 1727203992.07631: checking for any_errors_fatal 25675 1727203992.07634: done checking for any_errors_fatal 25675 1727203992.07635: checking for max_fail_percentage 25675 1727203992.07636: done checking for max_fail_percentage 25675 1727203992.07640: checking to see if all hosts have failed and the running result is not ok 25675 1727203992.07641: done checking to see if all hosts have failed 25675 1727203992.07641: getting the remaining hosts for this loop 25675 1727203992.07642: done getting the remaining hosts for this loop 25675 1727203992.07645: getting the next task for host managed-node2 25675 1727203992.07649: done getting next task for host managed-node2 25675 1727203992.07651: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25675 1727203992.07653: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203992.07662: getting variables 25675 1727203992.07663: in VariableManager get_vars() 25675 1727203992.07844: Calling all_inventory to load vars for managed-node2 25675 1727203992.07853: Calling groups_inventory to load vars for managed-node2 25675 1727203992.07855: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203992.07860: Calling all_plugins_play to load vars for managed-node2 25675 1727203992.07862: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203992.07866: Calling groups_plugins_play to load vars for managed-node2 25675 1727203992.08074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.08343: done with get_vars() 25675 1727203992.08351: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:53:12 -0400 (0:00:00.958) 0:00:11.535 ***** 25675 1727203992.08426: entering _queue_task() for managed-node2/include_tasks 25675 1727203992.08661: worker is 1 (out of 1 available) 25675 1727203992.08680: exiting _queue_task() for managed-node2/include_tasks 25675 1727203992.08696: done queuing things up, now waiting for results queue to drain 25675 1727203992.08697: waiting for pending results... 25675 1727203992.08854: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25675 1727203992.08925: in run() - task 028d2410-947f-41bd-b19d-000000000019 25675 1727203992.08937: variable 'ansible_search_path' from source: unknown 25675 1727203992.08940: variable 'ansible_search_path' from source: unknown 25675 1727203992.08969: calling self._execute() 25675 1727203992.09032: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203992.09037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203992.09046: variable 'omit' from source: magic vars 25675 1727203992.09318: variable 'ansible_distribution_major_version' from source: facts 25675 1727203992.09326: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203992.09331: _execute() done 25675 1727203992.09334: dumping result to json 25675 1727203992.09338: done dumping result, returning 25675 1727203992.09345: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-41bd-b19d-000000000019] 25675 1727203992.09350: sending task result for task 028d2410-947f-41bd-b19d-000000000019 25675 1727203992.09437: done sending task result for task 028d2410-947f-41bd-b19d-000000000019 25675 1727203992.09440: WORKER PROCESS EXITING 25675 1727203992.09501: no more pending results, returning what we have 25675 1727203992.09505: in VariableManager get_vars() 25675 1727203992.09539: Calling all_inventory to load vars for managed-node2 25675 1727203992.09541: Calling groups_inventory to load vars for managed-node2 25675 1727203992.09543: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203992.09552: Calling all_plugins_play to load vars for managed-node2 25675 1727203992.09555: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203992.09557: Calling groups_plugins_play to load vars for managed-node2 25675 1727203992.09699: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.09834: done with get_vars() 25675 1727203992.09841: variable 'ansible_search_path' from source: unknown 25675 1727203992.09841: variable 'ansible_search_path' from source: unknown 25675 1727203992.09859: we have included files to process 25675 1727203992.09860: generating all_blocks data 25675 1727203992.09861: done generating all_blocks data 25675 1727203992.09861: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727203992.09862: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727203992.09864: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727203992.10336: done processing included file 25675 1727203992.10338: iterating over new_blocks loaded from include file 25675 1727203992.10339: in VariableManager get_vars() 25675 1727203992.10350: done with get_vars() 25675 1727203992.10351: filtering new block on tags 25675 1727203992.10361: done filtering new block on tags 25675 1727203992.10362: in VariableManager get_vars() 25675 1727203992.10377: done with get_vars() 25675 1727203992.10379: filtering new block on tags 25675 1727203992.10391: done filtering new block on tags 25675 1727203992.10393: in VariableManager get_vars() 25675 1727203992.10403: done with get_vars() 25675 1727203992.10404: filtering new block on tags 25675 1727203992.10413: done filtering new block on tags 25675 1727203992.10414: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 25675 1727203992.10417: extending task lists for all hosts with included blocks 25675 1727203992.10620: done extending task lists 25675 1727203992.10621: done processing included files 25675 1727203992.10622: results queue empty 25675 1727203992.10622: checking for any_errors_fatal 25675 1727203992.10623: done checking for any_errors_fatal 25675 1727203992.10624: checking for max_fail_percentage 25675 1727203992.10624: done checking for max_fail_percentage 25675 1727203992.10625: checking to see if all hosts have failed and the running result is not ok 25675 1727203992.10625: done checking to see if all hosts have failed 25675 1727203992.10626: getting the remaining hosts for this loop 25675 1727203992.10627: done getting the remaining hosts for this loop 25675 1727203992.10628: getting the next task for host managed-node2 25675 1727203992.10630: done getting next task for host managed-node2 25675 1727203992.10632: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25675 1727203992.10633: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203992.10639: getting variables 25675 1727203992.10640: in VariableManager get_vars() 25675 1727203992.10648: Calling all_inventory to load vars for managed-node2 25675 1727203992.10649: Calling groups_inventory to load vars for managed-node2 25675 1727203992.10650: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203992.10653: Calling all_plugins_play to load vars for managed-node2 25675 1727203992.10655: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203992.10656: Calling groups_plugins_play to load vars for managed-node2 25675 1727203992.10769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.10987: done with get_vars() 25675 1727203992.10995: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:53:12 -0400 (0:00:00.026) 0:00:11.561 ***** 25675 1727203992.11058: entering _queue_task() for managed-node2/setup 25675 1727203992.11600: worker is 1 (out of 1 available) 25675 1727203992.11609: exiting _queue_task() for managed-node2/setup 25675 1727203992.11617: done queuing things up, now waiting for results queue to drain 25675 1727203992.11619: waiting for pending results... 25675 1727203992.11854: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25675 1727203992.11858: in run() - task 028d2410-947f-41bd-b19d-000000000279 25675 1727203992.11862: variable 'ansible_search_path' from source: unknown 25675 1727203992.11866: variable 'ansible_search_path' from source: unknown 25675 1727203992.11868: calling self._execute() 25675 1727203992.11952: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203992.11965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203992.11980: variable 'omit' from source: magic vars 25675 1727203992.12332: variable 'ansible_distribution_major_version' from source: facts 25675 1727203992.12348: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203992.12562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203992.14673: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203992.14752: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203992.14796: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203992.14833: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203992.14868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203992.14949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203992.15081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25675 1727203992.15085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203992.15088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203992.15090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203992.15133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203992.15161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203992.15192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203992.15240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203992.15260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203992.15424: variable '__network_required_facts' from source: role '' defaults 25675 1727203992.15439: variable 'ansible_facts' from source: unknown 25675 1727203992.15640: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25675 1727203992.15644: when evaluation is False, skipping this task 25675 1727203992.15646: _execute() done 25675 1727203992.15648: dumping result to json 25675 1727203992.15650: done dumping result, returning 25675 1727203992.15653: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-41bd-b19d-000000000279] 25675 1727203992.15655: sending task result for task 028d2410-947f-41bd-b19d-000000000279 25675 1727203992.15730: done sending task result for task 028d2410-947f-41bd-b19d-000000000279 25675 1727203992.15734: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727203992.15788: no more pending results, returning what we have 25675 1727203992.15794: results queue empty 25675 1727203992.15795: checking for any_errors_fatal 25675 1727203992.15796: done checking for any_errors_fatal 25675 1727203992.15797: checking for max_fail_percentage 25675 1727203992.15798: done checking for max_fail_percentage 25675 1727203992.15799: checking to see if all hosts have failed and the running result is not ok 25675 1727203992.15800: done checking to see if all hosts have failed 25675 1727203992.15801: getting the remaining hosts for this loop 25675 1727203992.15802: done getting the remaining hosts for 
this loop 25675 1727203992.15806: getting the next task for host managed-node2 25675 1727203992.15814: done getting next task for host managed-node2 25675 1727203992.15818: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25675 1727203992.15820: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203992.15835: getting variables 25675 1727203992.15837: in VariableManager get_vars() 25675 1727203992.15879: Calling all_inventory to load vars for managed-node2 25675 1727203992.15882: Calling groups_inventory to load vars for managed-node2 25675 1727203992.15884: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203992.15895: Calling all_plugins_play to load vars for managed-node2 25675 1727203992.15898: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203992.15901: Calling groups_plugins_play to load vars for managed-node2 25675 1727203992.16307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.16599: done with get_vars() 25675 1727203992.16612: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:53:12 -0400 (0:00:00.056) 0:00:11.618 ***** 25675 1727203992.16705: entering _queue_task() for managed-node2/stat 25675 1727203992.16956: worker is 1 (out of 1 available) 25675 1727203992.16968: exiting _queue_task() for managed-node2/stat 25675 1727203992.16981: done queuing things up, now waiting for results queue to drain 25675 1727203992.16982: waiting for pending results... 
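The skip recorded just above for "Ensure ansible_facts used by role are present" follows from the logged conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluating to False. A minimal Python sketch of that check, assuming hypothetical fact names (the role's actual required-fact list comes from its defaults and is not shown in this log):

# Sketch of the logged conditional:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# The fact names and values below are hypothetical placeholders.
required_facts = ["distribution", "distribution_major_version", "os_family"]
gathered_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "10",
    "os_family": "RedHat",
}

# Jinja2's difference() filter behaves like a set difference on the two lists.
missing = set(required_facts) - set(gathered_facts.keys())

# The setup task would run only if something were still missing; nothing is,
# so the evaluation is False and the task is skipped, matching the log above.
print(len(missing) > 0)  # False
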
25675 1727203992.17391: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 25675 1727203992.17396: in run() - task 028d2410-947f-41bd-b19d-00000000027b 25675 1727203992.17398: variable 'ansible_search_path' from source: unknown 25675 1727203992.17400: variable 'ansible_search_path' from source: unknown 25675 1727203992.17415: calling self._execute() 25675 1727203992.17491: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203992.17501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203992.17518: variable 'omit' from source: magic vars 25675 1727203992.17936: variable 'ansible_distribution_major_version' from source: facts 25675 1727203992.17954: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203992.18097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203992.18348: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203992.18398: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203992.18435: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203992.18474: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203992.18560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203992.18590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203992.18625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203992.18655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203992.18815: variable '__network_is_ostree' from source: set_fact 25675 1727203992.18818: Evaluated conditional (not __network_is_ostree is defined): False 25675 1727203992.18820: when evaluation is False, skipping this task 25675 1727203992.18822: _execute() done 25675 1727203992.18824: dumping result to json 25675 1727203992.18825: done dumping result, returning 25675 1727203992.18828: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-41bd-b19d-00000000027b] 25675 1727203992.18829: sending task result for task 028d2410-947f-41bd-b19d-00000000027b 25675 1727203992.18898: done sending task result for task 028d2410-947f-41bd-b19d-00000000027b 25675 1727203992.18901: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25675 1727203992.18967: no more pending results, returning what we have 25675 1727203992.18970: results queue empty 25675 1727203992.18971: checking for any_errors_fatal 25675 1727203992.18978: done checking for any_errors_fatal 25675 1727203992.18979: checking for 
max_fail_percentage 25675 1727203992.18981: done checking for max_fail_percentage 25675 1727203992.18982: checking to see if all hosts have failed and the running result is not ok 25675 1727203992.18983: done checking to see if all hosts have failed 25675 1727203992.18984: getting the remaining hosts for this loop 25675 1727203992.18985: done getting the remaining hosts for this loop 25675 1727203992.18989: getting the next task for host managed-node2 25675 1727203992.18995: done getting next task for host managed-node2 25675 1727203992.18999: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25675 1727203992.19001: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203992.19015: getting variables 25675 1727203992.19017: in VariableManager get_vars() 25675 1727203992.19053: Calling all_inventory to load vars for managed-node2 25675 1727203992.19056: Calling groups_inventory to load vars for managed-node2 25675 1727203992.19058: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203992.19068: Calling all_plugins_play to load vars for managed-node2 25675 1727203992.19071: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203992.19074: Calling groups_plugins_play to load vars for managed-node2 25675 1727203992.19579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.19807: done with get_vars() 25675 1727203992.19818: done getting variables 25675 1727203992.19874: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:53:12 -0400 (0:00:00.032) 0:00:11.650 ***** 25675 1727203992.19908: entering _queue_task() for managed-node2/set_fact 25675 1727203992.20168: worker is 1 (out of 1 available) 25675 1727203992.20182: exiting _queue_task() for managed-node2/set_fact 25675 1727203992.20193: done queuing things up, now waiting for results queue to drain 25675 1727203992.20195: waiting for pending results... 
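The "Check if system is ostree" task above is skipped because __network_is_ostree is already defined by an earlier set_fact in this play, so its stat probe never runs. A plain-Python sketch of that run-once pattern; the /run/ostree-booted marker is a common convention assumed here (the path the role actually stats is not visible in this excerpt), and the cached value is illustrative, since the log only shows that the fact is defined:

import os

def is_ostree_system() -> bool:
    # Marker file present on rpm-ostree / ostree-based systems (assumed probe).
    return os.path.exists("/run/ostree-booted")

# Value is illustrative; the log only shows that the fact already exists.
facts = {"__network_is_ostree": False}

if "__network_is_ostree" not in facts:   # mirrors "not __network_is_ostree is defined"
    facts["__network_is_ostree"] = is_ostree_system()
# The key already exists, so the probe is skipped here, and the follow-up
# "Set flag to indicate system is ostree" task is skipped the same way.
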
25675 1727203992.20596: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25675 1727203992.20601: in run() - task 028d2410-947f-41bd-b19d-00000000027c 25675 1727203992.20604: variable 'ansible_search_path' from source: unknown 25675 1727203992.20607: variable 'ansible_search_path' from source: unknown 25675 1727203992.20623: calling self._execute() 25675 1727203992.20713: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203992.20724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203992.20737: variable 'omit' from source: magic vars 25675 1727203992.21097: variable 'ansible_distribution_major_version' from source: facts 25675 1727203992.21113: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203992.21281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203992.21565: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203992.21616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203992.21654: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203992.21731: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203992.21823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203992.21852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203992.21891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203992.21924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203992.22021: variable '__network_is_ostree' from source: set_fact 25675 1727203992.22033: Evaluated conditional (not __network_is_ostree is defined): False 25675 1727203992.22040: when evaluation is False, skipping this task 25675 1727203992.22046: _execute() done 25675 1727203992.22053: dumping result to json 25675 1727203992.22061: done dumping result, returning 25675 1727203992.22071: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-41bd-b19d-00000000027c] 25675 1727203992.22084: sending task result for task 028d2410-947f-41bd-b19d-00000000027c skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25675 1727203992.22259: no more pending results, returning what we have 25675 1727203992.22262: results queue empty 25675 1727203992.22264: checking for any_errors_fatal 25675 1727203992.22269: done checking for any_errors_fatal 25675 1727203992.22270: checking for max_fail_percentage 25675 1727203992.22271: done checking for max_fail_percentage 25675 1727203992.22272: checking to see 
if all hosts have failed and the running result is not ok 25675 1727203992.22273: done checking to see if all hosts have failed 25675 1727203992.22274: getting the remaining hosts for this loop 25675 1727203992.22277: done getting the remaining hosts for this loop 25675 1727203992.22282: getting the next task for host managed-node2 25675 1727203992.22292: done getting next task for host managed-node2 25675 1727203992.22296: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25675 1727203992.22298: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203992.22312: getting variables 25675 1727203992.22314: in VariableManager get_vars() 25675 1727203992.22353: Calling all_inventory to load vars for managed-node2 25675 1727203992.22356: Calling groups_inventory to load vars for managed-node2 25675 1727203992.22359: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203992.22369: Calling all_plugins_play to load vars for managed-node2 25675 1727203992.22372: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203992.22480: Calling groups_plugins_play to load vars for managed-node2 25675 1727203992.22493: done sending task result for task 028d2410-947f-41bd-b19d-00000000027c 25675 1727203992.22496: WORKER PROCESS EXITING 25675 1727203992.22846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203992.23135: done with get_vars() 25675 1727203992.23147: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:53:12 -0400 (0:00:00.033) 0:00:11.683 ***** 25675 1727203992.23241: entering _queue_task() for managed-node2/service_facts 25675 1727203992.23243: Creating lock for service_facts 25675 1727203992.23708: worker is 1 (out of 1 available) 25675 1727203992.23717: exiting _queue_task() for managed-node2/service_facts 25675 1727203992.23725: done queuing things up, now waiting for results queue to drain 25675 1727203992.23727: waiting for pending results... 
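The service_facts task queued above produces the large {"ansible_facts": {"services": {...}}} payload that appears in the stdout chunk further on. A short Python sketch of consuming that structure; the two sample entries are abbreviated from the logged output, and the filtering itself is illustration only:

# Two entries abbreviated from the logged service_facts payload; the rest of
# the dictionary is omitted here.
services = {
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "network.service": {"name": "network.service", "state": "stopped",
                        "status": "not-found", "source": "systemd"},
}

# A network role typically needs to know which provider services are actually
# running before deciding how to apply connection profiles.
running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # ['NetworkManager.service']
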
25675 1727203992.23790: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 25675 1727203992.23906: in run() - task 028d2410-947f-41bd-b19d-00000000027e 25675 1727203992.23927: variable 'ansible_search_path' from source: unknown 25675 1727203992.23934: variable 'ansible_search_path' from source: unknown 25675 1727203992.23980: calling self._execute() 25675 1727203992.24063: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203992.24077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203992.24093: variable 'omit' from source: magic vars 25675 1727203992.24481: variable 'ansible_distribution_major_version' from source: facts 25675 1727203992.24500: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203992.24510: variable 'omit' from source: magic vars 25675 1727203992.24560: variable 'omit' from source: magic vars 25675 1727203992.24603: variable 'omit' from source: magic vars 25675 1727203992.24645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203992.24707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203992.24982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203992.24986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203992.24988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203992.24991: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203992.24993: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203992.24995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203992.25063: Set connection var ansible_shell_type to sh 25675 1727203992.25100: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203992.25113: Set connection var ansible_timeout to 10 25675 1727203992.25281: Set connection var ansible_pipelining to False 25675 1727203992.25284: Set connection var ansible_shell_executable to /bin/sh 25675 1727203992.25287: Set connection var ansible_connection to ssh 25675 1727203992.25290: variable 'ansible_shell_executable' from source: unknown 25675 1727203992.25292: variable 'ansible_connection' from source: unknown 25675 1727203992.25295: variable 'ansible_module_compression' from source: unknown 25675 1727203992.25297: variable 'ansible_shell_type' from source: unknown 25675 1727203992.25300: variable 'ansible_shell_executable' from source: unknown 25675 1727203992.25304: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203992.25307: variable 'ansible_pipelining' from source: unknown 25675 1727203992.25309: variable 'ansible_timeout' from source: unknown 25675 1727203992.25311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203992.25862: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727203992.25867: variable 'omit' from source: magic vars 25675 
1727203992.25869: starting attempt loop 25675 1727203992.25871: running the handler 25675 1727203992.25873: _low_level_execute_command(): starting 25675 1727203992.25877: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203992.26787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203992.26915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203992.26918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203992.26921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203992.27030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203992.28708: stdout chunk (state=3): >>>/root <<< 25675 1727203992.28866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203992.28871: stdout chunk (state=3): >>><<< 25675 1727203992.28873: stderr chunk (state=3): >>><<< 25675 1727203992.28895: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203992.28997: _low_level_execute_command(): starting 25675 1727203992.29001: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742 `" && echo ansible-tmp-1727203992.289032-26638-34064232112742="` echo 
/root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742 `" ) && sleep 0' 25675 1727203992.29682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203992.29854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203992.30131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203992.30213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203992.32096: stdout chunk (state=3): >>>ansible-tmp-1727203992.289032-26638-34064232112742=/root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742 <<< 25675 1727203992.32232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203992.32254: stdout chunk (state=3): >>><<< 25675 1727203992.32278: stderr chunk (state=3): >>><<< 25675 1727203992.32300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203992.289032-26638-34064232112742=/root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203992.32389: variable 'ansible_module_compression' from source: unknown 25675 1727203992.32467: ANSIBALLZ: Using lock for service_facts 25675 1727203992.32470: ANSIBALLZ: Acquiring lock 25675 1727203992.32473: ANSIBALLZ: Lock acquired: 139822502947072 25675 1727203992.32475: ANSIBALLZ: Creating module 25675 1727203992.49289: ANSIBALLZ: Writing 
module into payload 25675 1727203992.49355: ANSIBALLZ: Writing module 25675 1727203992.49404: ANSIBALLZ: Renaming module 25675 1727203992.49408: ANSIBALLZ: Done creating module 25675 1727203992.49411: variable 'ansible_facts' from source: unknown 25675 1727203992.49493: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/AnsiballZ_service_facts.py 25675 1727203992.49783: Sending initial data 25675 1727203992.49786: Sent initial data (160 bytes) 25675 1727203992.50691: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203992.50750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203992.50762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203992.50772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203992.50881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203992.52768: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203992.52773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203992.52818: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp7jpihomo /root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/AnsiballZ_service_facts.py <<< 25675 1727203992.52821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/AnsiballZ_service_facts.py" <<< 25675 1727203992.53106: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp7jpihomo" to remote "/root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/AnsiballZ_service_facts.py" <<< 25675 1727203992.54631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203992.54699: stderr chunk (state=3): >>><<< 25675 1727203992.54720: stdout chunk (state=3): >>><<< 25675 1727203992.54882: done transferring module to remote 25675 1727203992.54886: _low_level_execute_command(): starting 25675 1727203992.54889: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/ /root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/AnsiballZ_service_facts.py && sleep 0' 25675 1727203992.56091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203992.56237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203992.56356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203992.56369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203992.56472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203992.58352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203992.58377: stdout chunk (state=3): >>><<< 25675 1727203992.58395: stderr chunk (state=3): >>><<< 25675 1727203992.58414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203992.58568: _low_level_execute_command(): starting 25675 1727203992.58573: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/AnsiballZ_service_facts.py && sleep 0' 25675 1727203992.59567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203992.59583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203992.59610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203992.59638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203992.59734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203992.59763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203992.59779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203992.60006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203994.12970: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": 
"rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 25675 1727203994.13105: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 25675 1727203994.13124: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25675 1727203994.14533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727203994.14574: stderr chunk (state=3): >>><<< 25675 1727203994.14682: stdout chunk (state=3): >>><<< 25675 1727203994.14688: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": 
"nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": 
"systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727203994.16984: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203994.16988: _low_level_execute_command(): starting 25675 1727203994.16990: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203992.289032-26638-34064232112742/ > /dev/null 2>&1 && sleep 0' 25675 1727203994.17773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203994.17788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203994.17799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203994.17825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203994.17930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203994.17948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203994.18054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203994.20042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203994.20046: stdout chunk (state=3): >>><<< 25675 1727203994.20048: stderr chunk (state=3): >>><<< 25675 1727203994.20119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203994.20122: handler run complete 25675 1727203994.20458: variable 'ansible_facts' from source: unknown 25675 1727203994.20883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203994.21784: variable 'ansible_facts' from source: unknown 25675 1727203994.21926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203994.22147: attempt loop complete, returning result 25675 1727203994.22150: _execute() done 25675 1727203994.22155: dumping result to json 25675 1727203994.22226: done dumping result, returning 25675 1727203994.22236: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-41bd-b19d-00000000027e] 25675 1727203994.22240: sending task result for task 028d2410-947f-41bd-b19d-00000000027e ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727203994.23333: no more pending results, returning what we have 25675 1727203994.23337: results queue empty 25675 1727203994.23338: checking for any_errors_fatal 25675 1727203994.23344: done checking for any_errors_fatal 25675 1727203994.23345: checking for max_fail_percentage 25675 1727203994.23347: done checking for max_fail_percentage 25675 1727203994.23348: checking to see if all hosts have failed and the running result is not ok 25675 1727203994.23349: done checking to see if all hosts have failed 25675 1727203994.23350: getting the remaining hosts for this loop 25675 1727203994.23351: done getting the remaining hosts for this loop 25675 1727203994.23354: getting the next task for host managed-node2 25675 1727203994.23360: done getting next task for host managed-node2 25675 1727203994.23363: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25675 1727203994.23366: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203994.23382: getting variables 25675 1727203994.23384: in VariableManager get_vars() 25675 1727203994.23418: Calling all_inventory to load vars for managed-node2 25675 1727203994.23421: Calling groups_inventory to load vars for managed-node2 25675 1727203994.23423: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203994.23432: Calling all_plugins_play to load vars for managed-node2 25675 1727203994.23435: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203994.23438: Calling groups_plugins_play to load vars for managed-node2 25675 1727203994.24028: done sending task result for task 028d2410-947f-41bd-b19d-00000000027e 25675 1727203994.24031: WORKER PROCESS EXITING 25675 1727203994.24722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203994.25642: done with get_vars() 25675 1727203994.25657: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:53:14 -0400 (0:00:02.026) 0:00:13.709 ***** 25675 1727203994.25863: entering _queue_task() for managed-node2/package_facts 25675 1727203994.25866: Creating lock for package_facts 25675 1727203994.26684: worker is 1 (out of 1 available) 25675 1727203994.26698: exiting _queue_task() for managed-node2/package_facts 25675 1727203994.26709: done queuing things up, now waiting for results queue to drain 25675 1727203994.26711: waiting for pending results... 25675 1727203994.27018: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 25675 1727203994.27154: in run() - task 028d2410-947f-41bd-b19d-00000000027f 25675 1727203994.27182: variable 'ansible_search_path' from source: unknown 25675 1727203994.27191: variable 'ansible_search_path' from source: unknown 25675 1727203994.27234: calling self._execute() 25675 1727203994.27326: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203994.27339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203994.27439: variable 'omit' from source: magic vars 25675 1727203994.27797: variable 'ansible_distribution_major_version' from source: facts 25675 1727203994.27814: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203994.27827: variable 'omit' from source: magic vars 25675 1727203994.27898: variable 'omit' from source: magic vars 25675 1727203994.27938: variable 'omit' from source: magic vars 25675 1727203994.27991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203994.28035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203994.28061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203994.28107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203994.28113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203994.28182: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203994.28185: variable 'ansible_host' from source: host vars for 
'managed-node2' 25675 1727203994.28188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203994.28264: Set connection var ansible_shell_type to sh 25675 1727203994.28270: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203994.28280: Set connection var ansible_timeout to 10 25675 1727203994.28286: Set connection var ansible_pipelining to False 25675 1727203994.28323: Set connection var ansible_shell_executable to /bin/sh 25675 1727203994.28331: Set connection var ansible_connection to ssh 25675 1727203994.28334: variable 'ansible_shell_executable' from source: unknown 25675 1727203994.28338: variable 'ansible_connection' from source: unknown 25675 1727203994.28341: variable 'ansible_module_compression' from source: unknown 25675 1727203994.28343: variable 'ansible_shell_type' from source: unknown 25675 1727203994.28345: variable 'ansible_shell_executable' from source: unknown 25675 1727203994.28347: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203994.28349: variable 'ansible_pipelining' from source: unknown 25675 1727203994.28351: variable 'ansible_timeout' from source: unknown 25675 1727203994.28353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203994.28582: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727203994.28786: variable 'omit' from source: magic vars 25675 1727203994.28789: starting attempt loop 25675 1727203994.28791: running the handler 25675 1727203994.28806: _low_level_execute_command(): starting 25675 1727203994.28816: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203994.30190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203994.30225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203994.30234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203994.30249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203994.30300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203994.30341: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727203994.30344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203994.30347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203994.30363: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727203994.30379: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727203994.30418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203994.30491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203994.30551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203994.30555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203994.30557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203994.30661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203994.32359: stdout chunk (state=3): >>>/root <<< 25675 1727203994.32503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203994.32533: stderr chunk (state=3): >>><<< 25675 1727203994.32544: stdout chunk (state=3): >>><<< 25675 1727203994.32663: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203994.32668: _low_level_execute_command(): starting 25675 1727203994.32671: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761 `" && echo ansible-tmp-1727203994.3257265-26740-274036900044761="` echo /root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761 `" ) && sleep 0' 25675 1727203994.33192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203994.33216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203994.33230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203994.33250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203994.33270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203994.33364: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203994.33387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203994.33403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203994.33509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203994.35712: stdout chunk (state=3): >>>ansible-tmp-1727203994.3257265-26740-274036900044761=/root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761 <<< 25675 1727203994.35722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203994.35735: stdout chunk (state=3): >>><<< 25675 1727203994.35742: stderr chunk (state=3): >>><<< 25675 1727203994.35764: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203994.3257265-26740-274036900044761=/root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203994.35820: variable 'ansible_module_compression' from source: unknown 25675 1727203994.35879: ANSIBALLZ: Using lock for package_facts 25675 1727203994.35889: ANSIBALLZ: Acquiring lock 25675 1727203994.35982: ANSIBALLZ: Lock acquired: 139822504008048 25675 1727203994.35985: ANSIBALLZ: Creating module 25675 1727203994.68583: ANSIBALLZ: Writing module into payload 25675 1727203994.69082: ANSIBALLZ: Writing module 25675 1727203994.69086: ANSIBALLZ: Renaming module 25675 1727203994.69088: ANSIBALLZ: Done creating module 25675 1727203994.69090: variable 'ansible_facts' from source: unknown 25675 1727203994.69092: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/AnsiballZ_package_facts.py 25675 1727203994.69201: Sending initial data 25675 1727203994.69204: Sent initial data (162 bytes) 25675 1727203994.69986: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203994.69990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203994.69993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203994.69995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203994.69998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203994.70008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203994.70300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203994.71807: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25675 1727203994.71827: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203994.71902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203994.71985: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpb_voyhit /root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/AnsiballZ_package_facts.py <<< 25675 1727203994.71989: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/AnsiballZ_package_facts.py" <<< 25675 1727203994.72063: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpb_voyhit" to remote "/root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/AnsiballZ_package_facts.py" <<< 25675 1727203994.74914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203994.74917: stdout chunk (state=3): >>><<< 25675 1727203994.74919: stderr chunk (state=3): >>><<< 25675 1727203994.74940: done transferring module to remote 25675 1727203994.74957: _low_level_execute_command(): starting 25675 1727203994.74961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/ /root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/AnsiballZ_package_facts.py && sleep 0' 25675 1727203994.76004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203994.76009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203994.76063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203994.76123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203994.77941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203994.78181: stderr chunk (state=3): >>><<< 25675 1727203994.78185: stdout chunk (state=3): >>><<< 25675 1727203994.78188: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203994.78191: _low_level_execute_command(): starting 25675 1727203994.78193: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/AnsiballZ_package_facts.py && sleep 0' 25675 1727203994.78640: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203994.78644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203994.78653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203994.78667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203994.78685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203994.78691: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727203994.78701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203994.78713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203994.78792: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203994.78800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203994.78811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203994.78843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203994.78924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203995.23585: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 25675 1727203995.23602: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": 
"gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 25675 1727203995.23612: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": 
[{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": 
"psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source":<<< 25675 1727203995.23685: stdout chunk (state=3): >>> "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", 
"version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25675 1727203995.25461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203995.25478: stdout chunk (state=3): >>><<< 25675 1727203995.25497: stderr chunk (state=3): >>><<< 25675 1727203995.25539: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727203995.34175: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203995.34253: _low_level_execute_command(): starting 25675 1727203995.34255: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203994.3257265-26740-274036900044761/ > /dev/null 2>&1 && sleep 0' 25675 1727203995.34913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203995.34924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203995.34927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203995.35043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203995.36987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203995.36991: stdout chunk (state=3): >>><<< 25675 1727203995.37003: stderr chunk (state=3): >>><<< 25675 1727203995.37182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203995.37185: handler run complete 25675 1727203995.38183: variable 'ansible_facts' from source: unknown 25675 1727203995.38559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203995.40403: variable 'ansible_facts' from source: unknown 25675 1727203995.40802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203995.41493: attempt loop complete, returning result 25675 1727203995.41503: _execute() done 25675 1727203995.41506: dumping result to json 25675 1727203995.41716: done dumping result, returning 25675 1727203995.41725: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-41bd-b19d-00000000027f] 25675 1727203995.41727: sending task result for task 028d2410-947f-41bd-b19d-00000000027f 25675 1727203995.44022: done sending task result for task 028d2410-947f-41bd-b19d-00000000027f 25675 1727203995.44025: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727203995.44113: no more pending results, returning what we have 25675 1727203995.44116: results queue empty 25675 1727203995.44116: checking for any_errors_fatal 25675 1727203995.44121: done checking for any_errors_fatal 25675 1727203995.44122: checking for max_fail_percentage 25675 1727203995.44123: done checking for max_fail_percentage 25675 1727203995.44124: checking to see if all hosts have failed and the running result is not ok 25675 1727203995.44129: done checking to see if all hosts have failed 25675 1727203995.44130: getting the remaining hosts for this loop 25675 1727203995.44131: done getting the remaining hosts for this loop 25675 1727203995.44134: getting the next task for host managed-node2 25675 1727203995.44140: done getting next task for host managed-node2 25675 1727203995.44143: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25675 1727203995.44145: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203995.44155: getting variables 25675 1727203995.44156: in VariableManager get_vars() 25675 1727203995.44186: Calling all_inventory to load vars for managed-node2 25675 1727203995.44188: Calling groups_inventory to load vars for managed-node2 25675 1727203995.44191: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203995.44199: Calling all_plugins_play to load vars for managed-node2 25675 1727203995.44202: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203995.44205: Calling groups_plugins_play to load vars for managed-node2 25675 1727203995.45488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203995.47144: done with get_vars() 25675 1727203995.47175: done getting variables 25675 1727203995.47232: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:53:15 -0400 (0:00:01.214) 0:00:14.923 ***** 25675 1727203995.47267: entering _queue_task() for managed-node2/debug 25675 1727203995.47794: worker is 1 (out of 1 available) 25675 1727203995.47804: exiting _queue_task() for managed-node2/debug 25675 1727203995.47814: done queuing things up, now waiting for results queue to drain 25675 1727203995.47816: waiting for pending results... 25675 1727203995.47968: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 25675 1727203995.48055: in run() - task 028d2410-947f-41bd-b19d-00000000001a 25675 1727203995.48059: variable 'ansible_search_path' from source: unknown 25675 1727203995.48062: variable 'ansible_search_path' from source: unknown 25675 1727203995.48089: calling self._execute() 25675 1727203995.48180: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203995.48185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203995.48224: variable 'omit' from source: magic vars 25675 1727203995.48597: variable 'ansible_distribution_major_version' from source: facts 25675 1727203995.48602: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203995.48609: variable 'omit' from source: magic vars 25675 1727203995.48654: variable 'omit' from source: magic vars 25675 1727203995.48753: variable 'network_provider' from source: set_fact 25675 1727203995.48770: variable 'omit' from source: magic vars 25675 1727203995.48814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203995.48852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203995.49180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203995.49184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203995.49186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 
1727203995.49189: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203995.49191: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203995.49194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203995.49196: Set connection var ansible_shell_type to sh 25675 1727203995.49198: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203995.49200: Set connection var ansible_timeout to 10 25675 1727203995.49202: Set connection var ansible_pipelining to False 25675 1727203995.49205: Set connection var ansible_shell_executable to /bin/sh 25675 1727203995.49207: Set connection var ansible_connection to ssh 25675 1727203995.49209: variable 'ansible_shell_executable' from source: unknown 25675 1727203995.49212: variable 'ansible_connection' from source: unknown 25675 1727203995.49214: variable 'ansible_module_compression' from source: unknown 25675 1727203995.49216: variable 'ansible_shell_type' from source: unknown 25675 1727203995.49218: variable 'ansible_shell_executable' from source: unknown 25675 1727203995.49220: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203995.49222: variable 'ansible_pipelining' from source: unknown 25675 1727203995.49224: variable 'ansible_timeout' from source: unknown 25675 1727203995.49226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203995.49259: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203995.49267: variable 'omit' from source: magic vars 25675 1727203995.49275: starting attempt loop 25675 1727203995.49279: running the handler 25675 1727203995.49323: handler run complete 25675 1727203995.49334: attempt loop complete, returning result 25675 1727203995.49338: _execute() done 25675 1727203995.49344: dumping result to json 25675 1727203995.49349: done dumping result, returning 25675 1727203995.49356: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-41bd-b19d-00000000001a] 25675 1727203995.49362: sending task result for task 028d2410-947f-41bd-b19d-00000000001a 25675 1727203995.49442: done sending task result for task 028d2410-947f-41bd-b19d-00000000001a 25675 1727203995.49446: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 25675 1727203995.49503: no more pending results, returning what we have 25675 1727203995.49507: results queue empty 25675 1727203995.49507: checking for any_errors_fatal 25675 1727203995.49517: done checking for any_errors_fatal 25675 1727203995.49518: checking for max_fail_percentage 25675 1727203995.49520: done checking for max_fail_percentage 25675 1727203995.49521: checking to see if all hosts have failed and the running result is not ok 25675 1727203995.49522: done checking to see if all hosts have failed 25675 1727203995.49522: getting the remaining hosts for this loop 25675 1727203995.49524: done getting the remaining hosts for this loop 25675 1727203995.49528: getting the next task for host managed-node2 25675 1727203995.49534: done getting next task for host managed-node2 25675 1727203995.49537: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 25675 1727203995.49539: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203995.49548: getting variables 25675 1727203995.49550: in VariableManager get_vars() 25675 1727203995.49588: Calling all_inventory to load vars for managed-node2 25675 1727203995.49591: Calling groups_inventory to load vars for managed-node2 25675 1727203995.49594: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203995.49604: Calling all_plugins_play to load vars for managed-node2 25675 1727203995.49607: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203995.49609: Calling groups_plugins_play to load vars for managed-node2 25675 1727203995.51282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203995.52871: done with get_vars() 25675 1727203995.52896: done getting variables 25675 1727203995.52953: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.057) 0:00:14.981 ***** 25675 1727203995.52981: entering _queue_task() for managed-node2/fail 25675 1727203995.53383: worker is 1 (out of 1 available) 25675 1727203995.53394: exiting _queue_task() for managed-node2/fail 25675 1727203995.53403: done queuing things up, now waiting for results queue to drain 25675 1727203995.53404: waiting for pending results... 
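
For orientation, the two tasks traced above ("Check which packages are installed" and "Print network provider") correspond to an ordinary package_facts call and a debug task. The sketch below is an assumed reconstruction based only on what the log shows (the module_args {"manager": ["auto"], "strategy": "first"}, the no_log censoring of the result, and the "Using network provider: nm" message); it is not the verbatim source of the fedora.linux_system_roles.network role.

```yaml
# Hedged sketch, reconstructed from the trace above -- not the role's exact source.
- name: Check which packages are installed
  package_facts:
    manager: auto        # matches the logged module_args {"manager": ["auto"], "strategy": "first"}
  no_log: true           # why the result above is shown as "censored ... 'no_log: true'"

- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"   # produced "Using network provider: nm"
```
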
25675 1727203995.53651: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25675 1727203995.53656: in run() - task 028d2410-947f-41bd-b19d-00000000001b 25675 1727203995.53675: variable 'ansible_search_path' from source: unknown 25675 1727203995.53681: variable 'ansible_search_path' from source: unknown 25675 1727203995.53716: calling self._execute() 25675 1727203995.53795: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203995.53806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203995.53815: variable 'omit' from source: magic vars 25675 1727203995.54210: variable 'ansible_distribution_major_version' from source: facts 25675 1727203995.54307: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203995.54348: variable 'network_state' from source: role '' defaults 25675 1727203995.54358: Evaluated conditional (network_state != {}): False 25675 1727203995.54362: when evaluation is False, skipping this task 25675 1727203995.54364: _execute() done 25675 1727203995.54367: dumping result to json 25675 1727203995.54369: done dumping result, returning 25675 1727203995.54378: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-41bd-b19d-00000000001b] 25675 1727203995.54383: sending task result for task 028d2410-947f-41bd-b19d-00000000001b 25675 1727203995.54486: done sending task result for task 028d2410-947f-41bd-b19d-00000000001b 25675 1727203995.54488: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727203995.54541: no more pending results, returning what we have 25675 1727203995.54545: results queue empty 25675 1727203995.54546: checking for any_errors_fatal 25675 1727203995.54669: done checking for any_errors_fatal 25675 1727203995.54671: checking for max_fail_percentage 25675 1727203995.54673: done checking for max_fail_percentage 25675 1727203995.54674: checking to see if all hosts have failed and the running result is not ok 25675 1727203995.54675: done checking to see if all hosts have failed 25675 1727203995.54677: getting the remaining hosts for this loop 25675 1727203995.54678: done getting the remaining hosts for this loop 25675 1727203995.54681: getting the next task for host managed-node2 25675 1727203995.54686: done getting next task for host managed-node2 25675 1727203995.54690: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25675 1727203995.54692: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203995.54705: getting variables 25675 1727203995.54707: in VariableManager get_vars() 25675 1727203995.54742: Calling all_inventory to load vars for managed-node2 25675 1727203995.54745: Calling groups_inventory to load vars for managed-node2 25675 1727203995.54747: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203995.54757: Calling all_plugins_play to load vars for managed-node2 25675 1727203995.54760: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203995.54763: Calling groups_plugins_play to load vars for managed-node2 25675 1727203995.56129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203995.58757: done with get_vars() 25675 1727203995.58783: done getting variables 25675 1727203995.58843: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.058) 0:00:15.039 ***** 25675 1727203995.58874: entering _queue_task() for managed-node2/fail 25675 1727203995.59586: worker is 1 (out of 1 available) 25675 1727203995.59600: exiting _queue_task() for managed-node2/fail 25675 1727203995.59612: done queuing things up, now waiting for results queue to drain 25675 1727203995.59614: waiting for pending results... 
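The second guard (main.yml:18) follows the same pattern: `fail` is the action, and the only condition the log evaluates is again `network_state != {}`. A sketch follows, with the version check implied by the task name added purely as an assumption since it is never reached in this run.

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Placeholder text; not taken from the role source.
  when:
    - network_state != {}                            # reported false_condition
    - ansible_distribution_major_version | int < 8   # assumed from the task name, not evaluated in this log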
25675 1727203995.60395: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25675 1727203995.60400: in run() - task 028d2410-947f-41bd-b19d-00000000001c 25675 1727203995.60403: variable 'ansible_search_path' from source: unknown 25675 1727203995.60406: variable 'ansible_search_path' from source: unknown 25675 1727203995.60449: calling self._execute() 25675 1727203995.60741: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203995.60745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203995.60748: variable 'omit' from source: magic vars 25675 1727203995.61467: variable 'ansible_distribution_major_version' from source: facts 25675 1727203995.61514: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203995.61665: variable 'network_state' from source: role '' defaults 25675 1727203995.61679: Evaluated conditional (network_state != {}): False 25675 1727203995.61683: when evaluation is False, skipping this task 25675 1727203995.61685: _execute() done 25675 1727203995.61688: dumping result to json 25675 1727203995.61694: done dumping result, returning 25675 1727203995.61700: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-41bd-b19d-00000000001c] 25675 1727203995.61705: sending task result for task 028d2410-947f-41bd-b19d-00000000001c 25675 1727203995.61802: done sending task result for task 028d2410-947f-41bd-b19d-00000000001c 25675 1727203995.61806: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727203995.61860: no more pending results, returning what we have 25675 1727203995.61865: results queue empty 25675 1727203995.61866: checking for any_errors_fatal 25675 1727203995.61873: done checking for any_errors_fatal 25675 1727203995.61873: checking for max_fail_percentage 25675 1727203995.61877: done checking for max_fail_percentage 25675 1727203995.61879: checking to see if all hosts have failed and the running result is not ok 25675 1727203995.61880: done checking to see if all hosts have failed 25675 1727203995.61880: getting the remaining hosts for this loop 25675 1727203995.61882: done getting the remaining hosts for this loop 25675 1727203995.61886: getting the next task for host managed-node2 25675 1727203995.61892: done getting next task for host managed-node2 25675 1727203995.61896: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25675 1727203995.61898: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203995.61914: getting variables 25675 1727203995.61916: in VariableManager get_vars() 25675 1727203995.61957: Calling all_inventory to load vars for managed-node2 25675 1727203995.61961: Calling groups_inventory to load vars for managed-node2 25675 1727203995.61963: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203995.62190: Calling all_plugins_play to load vars for managed-node2 25675 1727203995.62195: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203995.62199: Calling groups_plugins_play to load vars for managed-node2 25675 1727203995.63612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203995.66033: done with get_vars() 25675 1727203995.66067: done getting variables 25675 1727203995.66130: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.072) 0:00:15.112 ***** 25675 1727203995.66168: entering _queue_task() for managed-node2/fail 25675 1727203995.66486: worker is 1 (out of 1 available) 25675 1727203995.66501: exiting _queue_task() for managed-node2/fail 25675 1727203995.66513: done queuing things up, now waiting for results queue to drain 25675 1727203995.66514: waiting for pending results... 
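For the EL10 teaming guard (main.yml:25) the log evaluates three conditionals in order, so the sketch below is better grounded: the version and distribution checks come back True and the team-interface check False, which is why the task ends up skipped. Only the `msg` text is invented.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Placeholder; team interfaces would not be supported on this platform.
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0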
25675 1727203995.66754: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25675 1727203995.66882: in run() - task 028d2410-947f-41bd-b19d-00000000001d 25675 1727203995.66885: variable 'ansible_search_path' from source: unknown 25675 1727203995.66887: variable 'ansible_search_path' from source: unknown 25675 1727203995.66922: calling self._execute() 25675 1727203995.67011: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203995.67030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203995.67081: variable 'omit' from source: magic vars 25675 1727203995.67446: variable 'ansible_distribution_major_version' from source: facts 25675 1727203995.67470: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203995.67703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203995.70268: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203995.70344: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203995.70388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203995.70428: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203995.70453: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203995.70536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203995.70563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203995.70596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.70650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203995.70666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203995.70768: variable 'ansible_distribution_major_version' from source: facts 25675 1727203995.70785: Evaluated conditional (ansible_distribution_major_version | int > 9): True 25675 1727203995.70904: variable 'ansible_distribution' from source: facts 25675 1727203995.70907: variable '__network_rh_distros' from source: role '' defaults 25675 1727203995.70917: Evaluated conditional (ansible_distribution in __network_rh_distros): True 25675 1727203995.71191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203995.71220: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203995.71244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.71290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203995.71304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203995.71355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203995.71383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203995.71459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.71463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203995.71466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203995.71496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203995.71517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203995.71542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.71579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203995.71594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203995.71893: variable 'network_connections' from source: play vars 25675 1727203995.71900: variable 'interface' from source: set_fact 25675 1727203995.72004: variable 'interface' from source: set_fact 25675 1727203995.72007: variable 'interface' from source: set_fact 25675 1727203995.72040: variable 'interface' from source: set_fact 25675 1727203995.72050: variable 'network_state' from source: role '' defaults 25675 
1727203995.72112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203995.72279: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203995.72582: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203995.72585: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203995.72588: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203995.72590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203995.72600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203995.72602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.72604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203995.72606: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 25675 1727203995.72608: when evaluation is False, skipping this task 25675 1727203995.72611: _execute() done 25675 1727203995.72613: dumping result to json 25675 1727203995.72616: done dumping result, returning 25675 1727203995.72619: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-41bd-b19d-00000000001d] 25675 1727203995.72622: sending task result for task 028d2410-947f-41bd-b19d-00000000001d skipping: [managed-node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 25675 1727203995.72736: no more pending results, returning what we have 25675 1727203995.72740: results queue empty 25675 1727203995.72741: checking for any_errors_fatal 25675 1727203995.72748: done checking for any_errors_fatal 25675 1727203995.72749: checking for max_fail_percentage 25675 1727203995.72751: done checking for max_fail_percentage 25675 1727203995.72752: checking to see if all hosts have failed and the running result is not ok 25675 1727203995.72753: done checking to see if all hosts have failed 25675 1727203995.72753: getting the remaining hosts for this loop 25675 1727203995.72755: done getting the remaining hosts for this loop 25675 1727203995.72759: getting the next task for host managed-node2 25675 1727203995.72765: done getting next task for host managed-node2 25675 1727203995.72769: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25675 1727203995.72771: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203995.72889: getting variables 25675 1727203995.72891: in VariableManager get_vars() 25675 1727203995.72928: Calling all_inventory to load vars for managed-node2 25675 1727203995.72931: Calling groups_inventory to load vars for managed-node2 25675 1727203995.72933: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203995.72943: Calling all_plugins_play to load vars for managed-node2 25675 1727203995.72946: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203995.72950: Calling groups_plugins_play to load vars for managed-node2 25675 1727203995.73492: done sending task result for task 028d2410-947f-41bd-b19d-00000000001d 25675 1727203995.73496: WORKER PROCESS EXITING 25675 1727203995.75635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203995.77697: done with get_vars() 25675 1727203995.77727: done getting variables 25675 1727203995.77837: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.116) 0:00:15.229 ***** 25675 1727203995.77874: entering _queue_task() for managed-node2/dnf 25675 1727203995.78237: worker is 1 (out of 1 available) 25675 1727203995.78249: exiting _queue_task() for managed-node2/dnf 25675 1727203995.78260: done queuing things up, now waiting for results queue to drain 25675 1727203995.78262: waiting for pending results... 
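The DNF check queued here (main.yml:36) loads the `dnf` action module, and the log later evaluates two conditionals: the Fedora/EL8-or-newer platform test (True) and the wireless-or-team test (False), so the task is skipped. In the sketch below only the `when` expressions are taken from the evaluated conditionals; the module arguments are assumptions, since the task never runs and its parameters are not shown in the log.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # assumed argument, not shown in the log
    state: latest                    # assumed argument
  check_mode: true                   # assumed; a pure availability check would not change the system
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined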
25675 1727203995.78595: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25675 1727203995.78649: in run() - task 028d2410-947f-41bd-b19d-00000000001e 25675 1727203995.78663: variable 'ansible_search_path' from source: unknown 25675 1727203995.78667: variable 'ansible_search_path' from source: unknown 25675 1727203995.78704: calling self._execute() 25675 1727203995.78792: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203995.78850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203995.78854: variable 'omit' from source: magic vars 25675 1727203995.79511: variable 'ansible_distribution_major_version' from source: facts 25675 1727203995.79522: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203995.79766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203995.84524: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203995.84682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203995.84727: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203995.84761: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203995.84903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203995.84985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203995.85184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203995.85189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.85192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203995.85404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203995.85408: variable 'ansible_distribution' from source: facts 25675 1727203995.85410: variable 'ansible_distribution_major_version' from source: facts 25675 1727203995.85412: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25675 1727203995.85680: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727203995.85713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203995.85734: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203995.85761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.85841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203995.85854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203995.85956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203995.85986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203995.86014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.86050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203995.86063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203995.86183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203995.86186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203995.86188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.86230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203995.86245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203995.86464: variable 'network_connections' from source: play vars 25675 1727203995.86484: variable 'interface' from source: set_fact 25675 1727203995.86670: variable 'interface' from source: set_fact 25675 1727203995.86739: variable 'interface' from source: set_fact 25675 1727203995.86869: variable 'interface' from source: set_fact 25675 1727203995.86950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 25675 1727203995.87407: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203995.87438: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203995.87541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203995.87615: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203995.87680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203995.87712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203995.87753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203995.87788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203995.87855: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727203995.88114: variable 'network_connections' from source: play vars 25675 1727203995.88126: variable 'interface' from source: set_fact 25675 1727203995.88254: variable 'interface' from source: set_fact 25675 1727203995.88257: variable 'interface' from source: set_fact 25675 1727203995.88265: variable 'interface' from source: set_fact 25675 1727203995.88303: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727203995.88310: when evaluation is False, skipping this task 25675 1727203995.88316: _execute() done 25675 1727203995.88321: dumping result to json 25675 1727203995.88327: done dumping result, returning 25675 1727203995.88336: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-41bd-b19d-00000000001e] 25675 1727203995.88343: sending task result for task 028d2410-947f-41bd-b19d-00000000001e skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727203995.88540: no more pending results, returning what we have 25675 1727203995.88544: results queue empty 25675 1727203995.88545: checking for any_errors_fatal 25675 1727203995.88551: done checking for any_errors_fatal 25675 1727203995.88552: checking for max_fail_percentage 25675 1727203995.88554: done checking for max_fail_percentage 25675 1727203995.88555: checking to see if all hosts have failed and the running result is not ok 25675 1727203995.88556: done checking to see if all hosts have failed 25675 1727203995.88556: getting the remaining hosts for this loop 25675 1727203995.88558: done getting the remaining hosts for this loop 25675 1727203995.88562: getting the next task for host managed-node2 25675 1727203995.88568: done getting next task for host managed-node2 25675 
1727203995.88572: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25675 1727203995.88574: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203995.88592: getting variables 25675 1727203995.88594: in VariableManager get_vars() 25675 1727203995.88632: Calling all_inventory to load vars for managed-node2 25675 1727203995.88635: Calling groups_inventory to load vars for managed-node2 25675 1727203995.88637: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203995.88648: Calling all_plugins_play to load vars for managed-node2 25675 1727203995.88651: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203995.88653: Calling groups_plugins_play to load vars for managed-node2 25675 1727203995.89335: done sending task result for task 028d2410-947f-41bd-b19d-00000000001e 25675 1727203995.89339: WORKER PROCESS EXITING 25675 1727203995.90664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203995.94422: done with get_vars() 25675 1727203995.94779: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25675 1727203995.94984: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.171) 0:00:15.402 ***** 25675 1727203995.95143: entering _queue_task() for managed-node2/yum 25675 1727203995.95147: Creating lock for yum 25675 1727203995.96096: worker is 1 (out of 1 available) 25675 1727203995.96109: exiting _queue_task() for managed-node2/yum 25675 1727203995.96127: done queuing things up, now waiting for results queue to drain 25675 1727203995.96128: waiting for pending results... 
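The matching YUM check (main.yml:48) is confirmed by the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` entry to be written against the `yum` action, which this DNF-based system transparently redirects. The only condition the log evaluates is the version test, which is False on this host; the wireless/team condition suggested by the task name, and all module arguments, are assumptions in this sketch.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # assumed, mirroring the DNF variant above
    state: latest                    # assumed
  check_mode: true                   # assumed
  when:
    - ansible_distribution_major_version | int < 8   # reported false_condition
    - __network_wireless_connections_defined or __network_team_connections_defined   # assumed, never evaluated in this run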
25675 1727203995.96538: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25675 1727203995.96759: in run() - task 028d2410-947f-41bd-b19d-00000000001f 25675 1727203995.96991: variable 'ansible_search_path' from source: unknown 25675 1727203995.96995: variable 'ansible_search_path' from source: unknown 25675 1727203995.96998: calling self._execute() 25675 1727203995.97123: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203995.97136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203995.97150: variable 'omit' from source: magic vars 25675 1727203995.97834: variable 'ansible_distribution_major_version' from source: facts 25675 1727203995.97988: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203995.98416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203996.02641: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203996.03693: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203996.03696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203996.03699: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203996.03701: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203996.03930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.03966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203996.04045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.04095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.04198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.04585: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.04782: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25675 1727203996.04786: when evaluation is False, skipping this task 25675 1727203996.04788: _execute() done 25675 1727203996.04790: dumping result to json 25675 1727203996.04792: done dumping result, returning 25675 1727203996.04795: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-41bd-b19d-00000000001f] 25675 
1727203996.04797: sending task result for task 028d2410-947f-41bd-b19d-00000000001f 25675 1727203996.04870: done sending task result for task 028d2410-947f-41bd-b19d-00000000001f 25675 1727203996.04873: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25675 1727203996.04943: no more pending results, returning what we have 25675 1727203996.04947: results queue empty 25675 1727203996.04948: checking for any_errors_fatal 25675 1727203996.04953: done checking for any_errors_fatal 25675 1727203996.04954: checking for max_fail_percentage 25675 1727203996.04955: done checking for max_fail_percentage 25675 1727203996.04956: checking to see if all hosts have failed and the running result is not ok 25675 1727203996.04957: done checking to see if all hosts have failed 25675 1727203996.04958: getting the remaining hosts for this loop 25675 1727203996.04959: done getting the remaining hosts for this loop 25675 1727203996.04964: getting the next task for host managed-node2 25675 1727203996.04971: done getting next task for host managed-node2 25675 1727203996.04974: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25675 1727203996.04979: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203996.04995: getting variables 25675 1727203996.04997: in VariableManager get_vars() 25675 1727203996.05037: Calling all_inventory to load vars for managed-node2 25675 1727203996.05041: Calling groups_inventory to load vars for managed-node2 25675 1727203996.05043: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203996.05054: Calling all_plugins_play to load vars for managed-node2 25675 1727203996.05057: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203996.05061: Calling groups_plugins_play to load vars for managed-node2 25675 1727203996.17700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203996.20967: done with get_vars() 25675 1727203996.21104: done getting variables 25675 1727203996.21160: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.260) 0:00:15.663 ***** 25675 1727203996.21234: entering _queue_task() for managed-node2/fail 25675 1727203996.22127: worker is 1 (out of 1 available) 25675 1727203996.22138: exiting _queue_task() for managed-node2/fail 25675 1727203996.22148: done queuing things up, now waiting for results queue to drain 25675 1727203996.22150: waiting for pending results... 
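The consent prompt (main.yml:60) is another `fail` guard: it only fires when wireless or team connections are defined, and in this run both flags resolve to False, so it skips. The sketch keeps just the condition the log evaluates; the real task may carry an opt-out variable and a longer message, neither of which is visible here.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Placeholder; the actual prompt text is not visible in this log.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined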
25675 1727203996.22756: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25675 1727203996.22762: in run() - task 028d2410-947f-41bd-b19d-000000000020 25675 1727203996.22766: variable 'ansible_search_path' from source: unknown 25675 1727203996.22774: variable 'ansible_search_path' from source: unknown 25675 1727203996.22857: calling self._execute() 25675 1727203996.22983: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203996.23081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203996.23282: variable 'omit' from source: magic vars 25675 1727203996.23981: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.23985: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203996.24143: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727203996.24561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203996.29226: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203996.29409: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203996.29452: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203996.29487: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203996.29601: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203996.29768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.29799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203996.29829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.30081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.30084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.30158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.30188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203996.30214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.30252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.30383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.30429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.30452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203996.30479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.30636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.30652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.31066: variable 'network_connections' from source: play vars 25675 1727203996.31070: variable 'interface' from source: set_fact 25675 1727203996.31229: variable 'interface' from source: set_fact 25675 1727203996.31233: variable 'interface' from source: set_fact 25675 1727203996.31425: variable 'interface' from source: set_fact 25675 1727203996.31455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203996.31880: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203996.32018: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203996.32281: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203996.32285: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203996.32287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203996.32481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203996.32485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.32488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203996.32881: 
variable '__network_team_connections_defined' from source: role '' defaults 25675 1727203996.33038: variable 'network_connections' from source: play vars 25675 1727203996.33041: variable 'interface' from source: set_fact 25675 1727203996.33107: variable 'interface' from source: set_fact 25675 1727203996.33279: variable 'interface' from source: set_fact 25675 1727203996.33336: variable 'interface' from source: set_fact 25675 1727203996.33368: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727203996.33374: when evaluation is False, skipping this task 25675 1727203996.33378: _execute() done 25675 1727203996.33487: dumping result to json 25675 1727203996.33499: done dumping result, returning 25675 1727203996.33508: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000020] 25675 1727203996.33520: sending task result for task 028d2410-947f-41bd-b19d-000000000020 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727203996.33666: no more pending results, returning what we have 25675 1727203996.33670: results queue empty 25675 1727203996.33671: checking for any_errors_fatal 25675 1727203996.33679: done checking for any_errors_fatal 25675 1727203996.33680: checking for max_fail_percentage 25675 1727203996.33683: done checking for max_fail_percentage 25675 1727203996.33683: checking to see if all hosts have failed and the running result is not ok 25675 1727203996.33684: done checking to see if all hosts have failed 25675 1727203996.33685: getting the remaining hosts for this loop 25675 1727203996.33686: done getting the remaining hosts for this loop 25675 1727203996.33690: getting the next task for host managed-node2 25675 1727203996.33696: done getting next task for host managed-node2 25675 1727203996.33700: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 25675 1727203996.33702: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203996.33888: getting variables 25675 1727203996.33891: in VariableManager get_vars() 25675 1727203996.33930: Calling all_inventory to load vars for managed-node2 25675 1727203996.33933: Calling groups_inventory to load vars for managed-node2 25675 1727203996.33935: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203996.33945: Calling all_plugins_play to load vars for managed-node2 25675 1727203996.33948: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203996.33950: Calling groups_plugins_play to load vars for managed-node2 25675 1727203996.34686: done sending task result for task 028d2410-947f-41bd-b19d-000000000020 25675 1727203996.34694: WORKER PROCESS EXITING 25675 1727203996.36910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203996.40298: done with get_vars() 25675 1727203996.40440: done getting variables 25675 1727203996.40503: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.192) 0:00:15.856 ***** 25675 1727203996.40533: entering _queue_task() for managed-node2/package 25675 1727203996.41452: worker is 1 (out of 1 available) 25675 1727203996.41463: exiting _queue_task() for managed-node2/package 25675 1727203996.41477: done queuing things up, now waiting for results queue to drain 25675 1727203996.41479: waiting for pending results... 
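The install step (main.yml:73) uses the `package` action, resolves `network_packages` from the role defaults, and is ultimately skipped because the single evaluated condition, `not network_packages is subset(ansible_facts.packages.keys())`, is False: every required package is already present in the gathered package facts. A sketch under those observations, with the module arguments marked as assumptions:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # assumed; consistent with the network_packages variable resolved below
    state: present                   # assumed
  when: not network_packages is subset(ansible_facts.packages.keys())

Skipping the install when the package set is already a subset of `ansible_facts.packages` keeps the role idempotent and avoids touching the package manager on hosts that are already provisioned.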
25675 1727203996.41834: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 25675 1727203996.42030: in run() - task 028d2410-947f-41bd-b19d-000000000021 25675 1727203996.42053: variable 'ansible_search_path' from source: unknown 25675 1727203996.42057: variable 'ansible_search_path' from source: unknown 25675 1727203996.42083: calling self._execute() 25675 1727203996.42389: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203996.42393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203996.42395: variable 'omit' from source: magic vars 25675 1727203996.43281: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.43284: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203996.43531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203996.44086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203996.44189: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203996.44380: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203996.44428: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203996.44719: variable 'network_packages' from source: role '' defaults 25675 1727203996.44940: variable '__network_provider_setup' from source: role '' defaults 25675 1727203996.44970: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727203996.45019: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727203996.45027: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727203996.45303: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727203996.45635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203996.50470: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203996.50541: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203996.50877: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203996.50880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203996.50883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203996.50976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.51003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203996.51035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.51183: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.51200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.51343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.51346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203996.51357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.51449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.51467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.52103: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25675 1727203996.52295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.52321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203996.52389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.52880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.52884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.52886: variable 'ansible_python' from source: facts 25675 1727203996.52888: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25675 1727203996.52942: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727203996.53088: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727203996.53480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.53483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25675 1727203996.53486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.53545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.53549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.53691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203996.53713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203996.53741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.53780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203996.53910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203996.54177: variable 'network_connections' from source: play vars 25675 1727203996.54181: variable 'interface' from source: set_fact 25675 1727203996.54397: variable 'interface' from source: set_fact 25675 1727203996.54406: variable 'interface' from source: set_fact 25675 1727203996.54618: variable 'interface' from source: set_fact 25675 1727203996.54917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203996.54920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203996.54922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203996.55001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203996.55181: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727203996.55702: variable 'network_connections' from source: play vars 25675 1727203996.55716: variable 'interface' from source: set_fact 25675 1727203996.55959: variable 'interface' from source: set_fact 25675 1727203996.55969: variable 'interface' from source: set_fact 25675 1727203996.56126: variable 'interface' from source: set_fact 25675 1727203996.56240: variable 
'__network_packages_default_wireless' from source: role '' defaults 25675 1727203996.56411: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727203996.57059: variable 'network_connections' from source: play vars 25675 1727203996.57062: variable 'interface' from source: set_fact 25675 1727203996.57246: variable 'interface' from source: set_fact 25675 1727203996.57253: variable 'interface' from source: set_fact 25675 1727203996.57437: variable 'interface' from source: set_fact 25675 1727203996.57470: variable '__network_packages_default_team' from source: role '' defaults 25675 1727203996.57644: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727203996.58323: variable 'network_connections' from source: play vars 25675 1727203996.58326: variable 'interface' from source: set_fact 25675 1727203996.58503: variable 'interface' from source: set_fact 25675 1727203996.58509: variable 'interface' from source: set_fact 25675 1727203996.58580: variable 'interface' from source: set_fact 25675 1727203996.58775: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727203996.59034: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727203996.59037: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727203996.59039: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727203996.59530: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25675 1727203996.60441: variable 'network_connections' from source: play vars 25675 1727203996.60445: variable 'interface' from source: set_fact 25675 1727203996.60751: variable 'interface' from source: set_fact 25675 1727203996.60755: variable 'interface' from source: set_fact 25675 1727203996.60757: variable 'interface' from source: set_fact 25675 1727203996.60791: variable 'ansible_distribution' from source: facts 25675 1727203996.60799: variable '__network_rh_distros' from source: role '' defaults 25675 1727203996.60805: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.60829: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25675 1727203996.61246: variable 'ansible_distribution' from source: facts 25675 1727203996.61250: variable '__network_rh_distros' from source: role '' defaults 25675 1727203996.61255: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.61277: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25675 1727203996.61549: variable 'ansible_distribution' from source: facts 25675 1727203996.61553: variable '__network_rh_distros' from source: role '' defaults 25675 1727203996.61679: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.61717: variable 'network_provider' from source: set_fact 25675 1727203996.61732: variable 'ansible_facts' from source: unknown 25675 1727203996.63581: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 25675 1727203996.63584: when evaluation is False, skipping this task 25675 1727203996.63587: _execute() done 25675 1727203996.63589: dumping result to json 25675 1727203996.63591: done dumping result, returning 25675 1727203996.63594: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 
[028d2410-947f-41bd-b19d-000000000021] 25675 1727203996.63596: sending task result for task 028d2410-947f-41bd-b19d-000000000021 25675 1727203996.63659: done sending task result for task 028d2410-947f-41bd-b19d-000000000021 25675 1727203996.63661: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 25675 1727203996.63710: no more pending results, returning what we have 25675 1727203996.63714: results queue empty 25675 1727203996.63714: checking for any_errors_fatal 25675 1727203996.63834: done checking for any_errors_fatal 25675 1727203996.63836: checking for max_fail_percentage 25675 1727203996.63838: done checking for max_fail_percentage 25675 1727203996.63838: checking to see if all hosts have failed and the running result is not ok 25675 1727203996.63839: done checking to see if all hosts have failed 25675 1727203996.63840: getting the remaining hosts for this loop 25675 1727203996.63841: done getting the remaining hosts for this loop 25675 1727203996.63845: getting the next task for host managed-node2 25675 1727203996.63850: done getting next task for host managed-node2 25675 1727203996.63853: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25675 1727203996.63854: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203996.63868: getting variables 25675 1727203996.63869: in VariableManager get_vars() 25675 1727203996.63905: Calling all_inventory to load vars for managed-node2 25675 1727203996.63908: Calling groups_inventory to load vars for managed-node2 25675 1727203996.63910: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203996.63924: Calling all_plugins_play to load vars for managed-node2 25675 1727203996.63927: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203996.63929: Calling groups_plugins_play to load vars for managed-node2 25675 1727203996.67257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203996.70712: done with get_vars() 25675 1727203996.70740: done getting variables 25675 1727203996.70800: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.304) 0:00:16.160 ***** 25675 1727203996.70944: entering _queue_task() for managed-node2/package 25675 1727203996.71564: worker is 1 (out of 1 available) 25675 1727203996.71882: exiting _queue_task() for managed-node2/package 25675 1727203996.71893: done queuing things up, now waiting for results queue to drain 25675 1727203996.71895: waiting for pending results... 
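For context, the skip reported just above corresponds to a conditional package-install task. A minimal sketch of an equivalent task, assuming the package action module the log reports loading and the condition quoted in false_condition (this is not the role's actual source; "state: present" is an assumption):

  # Sketch only: "state: present" is assumed; the when clause is the
  # false_condition reported for the skip above.
  - name: Install packages
    package:
      name: "{{ network_packages }}"
      state: present
    when: not network_packages is subset(ansible_facts.packages.keys())

Here the False evaluation simply means every entry in network_packages was already present in the gathered package facts on managed-node2, so nothing needed to be installed.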
25675 1727203996.72097: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25675 1727203996.72360: in run() - task 028d2410-947f-41bd-b19d-000000000022 25675 1727203996.72378: variable 'ansible_search_path' from source: unknown 25675 1727203996.72382: variable 'ansible_search_path' from source: unknown 25675 1727203996.72418: calling self._execute() 25675 1727203996.72631: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203996.72634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203996.72681: variable 'omit' from source: magic vars 25675 1727203996.73522: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.73528: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203996.74280: variable 'network_state' from source: role '' defaults 25675 1727203996.74284: Evaluated conditional (network_state != {}): False 25675 1727203996.74286: when evaluation is False, skipping this task 25675 1727203996.74288: _execute() done 25675 1727203996.74289: dumping result to json 25675 1727203996.74291: done dumping result, returning 25675 1727203996.74295: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-41bd-b19d-000000000022] 25675 1727203996.74298: sending task result for task 028d2410-947f-41bd-b19d-000000000022 25675 1727203996.74365: done sending task result for task 028d2410-947f-41bd-b19d-000000000022 25675 1727203996.74369: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727203996.74409: no more pending results, returning what we have 25675 1727203996.74412: results queue empty 25675 1727203996.74413: checking for any_errors_fatal 25675 1727203996.74418: done checking for any_errors_fatal 25675 1727203996.74418: checking for max_fail_percentage 25675 1727203996.74420: done checking for max_fail_percentage 25675 1727203996.74421: checking to see if all hosts have failed and the running result is not ok 25675 1727203996.74422: done checking to see if all hosts have failed 25675 1727203996.74423: getting the remaining hosts for this loop 25675 1727203996.74424: done getting the remaining hosts for this loop 25675 1727203996.74427: getting the next task for host managed-node2 25675 1727203996.74432: done getting next task for host managed-node2 25675 1727203996.74435: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25675 1727203996.74437: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203996.74451: getting variables 25675 1727203996.74452: in VariableManager get_vars() 25675 1727203996.74487: Calling all_inventory to load vars for managed-node2 25675 1727203996.74490: Calling groups_inventory to load vars for managed-node2 25675 1727203996.74493: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203996.74501: Calling all_plugins_play to load vars for managed-node2 25675 1727203996.74503: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203996.74507: Calling groups_plugins_play to load vars for managed-node2 25675 1727203996.77273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203996.81027: done with get_vars() 25675 1727203996.81134: done getting variables 25675 1727203996.81195: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.102) 0:00:16.263 ***** 25675 1727203996.81223: entering _queue_task() for managed-node2/package 25675 1727203996.81882: worker is 1 (out of 1 available) 25675 1727203996.82033: exiting _queue_task() for managed-node2/package 25675 1727203996.82044: done queuing things up, now waiting for results queue to drain 25675 1727203996.82045: waiting for pending results... 
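The same pattern governs the nmstate-related install: network_state falls back to the role default of {}, so the conditional is False and the task skips. A hedged sketch of the kind of task being skipped (package names are inferred from the task title, not read from the role source):

  # Sketch only: the package list is inferred from the task name.
  - name: Install NetworkManager and nmstate when using network_state variable
    package:
      name:
        - NetworkManager
        - nmstate
      state: present
    when: network_state != {}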
25675 1727203996.82578: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25675 1727203996.82647: in run() - task 028d2410-947f-41bd-b19d-000000000023 25675 1727203996.82981: variable 'ansible_search_path' from source: unknown 25675 1727203996.82985: variable 'ansible_search_path' from source: unknown 25675 1727203996.82987: calling self._execute() 25675 1727203996.83015: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203996.83023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203996.83033: variable 'omit' from source: magic vars 25675 1727203996.83822: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.83832: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203996.84392: variable 'network_state' from source: role '' defaults 25675 1727203996.84402: Evaluated conditional (network_state != {}): False 25675 1727203996.84405: when evaluation is False, skipping this task 25675 1727203996.84408: _execute() done 25675 1727203996.84410: dumping result to json 25675 1727203996.84414: done dumping result, returning 25675 1727203996.84422: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-41bd-b19d-000000000023] 25675 1727203996.84428: sending task result for task 028d2410-947f-41bd-b19d-000000000023 25675 1727203996.84537: done sending task result for task 028d2410-947f-41bd-b19d-000000000023 25675 1727203996.84541: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727203996.84593: no more pending results, returning what we have 25675 1727203996.84597: results queue empty 25675 1727203996.84598: checking for any_errors_fatal 25675 1727203996.84604: done checking for any_errors_fatal 25675 1727203996.84605: checking for max_fail_percentage 25675 1727203996.84607: done checking for max_fail_percentage 25675 1727203996.84608: checking to see if all hosts have failed and the running result is not ok 25675 1727203996.84609: done checking to see if all hosts have failed 25675 1727203996.84609: getting the remaining hosts for this loop 25675 1727203996.84611: done getting the remaining hosts for this loop 25675 1727203996.84614: getting the next task for host managed-node2 25675 1727203996.84621: done getting next task for host managed-node2 25675 1727203996.84625: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25675 1727203996.84627: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203996.84643: getting variables 25675 1727203996.84644: in VariableManager get_vars() 25675 1727203996.84687: Calling all_inventory to load vars for managed-node2 25675 1727203996.84690: Calling groups_inventory to load vars for managed-node2 25675 1727203996.84692: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203996.84704: Calling all_plugins_play to load vars for managed-node2 25675 1727203996.84706: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203996.84709: Calling groups_plugins_play to load vars for managed-node2 25675 1727203996.88610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203996.92141: done with get_vars() 25675 1727203996.92172: done getting variables 25675 1727203996.92539: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.113) 0:00:16.376 ***** 25675 1727203996.92567: entering _queue_task() for managed-node2/service 25675 1727203996.92569: Creating lock for service 25675 1727203996.93429: worker is 1 (out of 1 available) 25675 1727203996.93440: exiting _queue_task() for managed-node2/service 25675 1727203996.93451: done queuing things up, now waiting for results queue to drain 25675 1727203996.93452: waiting for pending results... 
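Both network_state-gated installs skip because this play never sets network_state. To exercise that branch, a caller would pass a non-empty nmstate-style state to the role, roughly as below; the variable names and values are purely illustrative and do not come from this run:

  # Hypothetical play vars; in the traced run network_state stays at the role
  # default of {}, so the two install tasks above are skipped.
  vars:
    network_state:
      interfaces:
        - name: eth0
          type: ethernet
          state: up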
25675 1727203996.93993: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25675 1727203996.94004: in run() - task 028d2410-947f-41bd-b19d-000000000024 25675 1727203996.94109: variable 'ansible_search_path' from source: unknown 25675 1727203996.94113: variable 'ansible_search_path' from source: unknown 25675 1727203996.94187: calling self._execute() 25675 1727203996.94405: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203996.94411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203996.94421: variable 'omit' from source: magic vars 25675 1727203996.95481: variable 'ansible_distribution_major_version' from source: facts 25675 1727203996.95485: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203996.95569: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727203996.95857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203997.00343: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203997.00533: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203997.00574: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203997.00707: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203997.00851: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203997.00929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203997.01117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203997.01120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.01123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203997.01131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203997.01296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203997.01318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203997.01342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25675 1727203997.01497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203997.01512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203997.01551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203997.01579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203997.01713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.01751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203997.01769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203997.02248: variable 'network_connections' from source: play vars 25675 1727203997.02281: variable 'interface' from source: set_fact 25675 1727203997.02337: variable 'interface' from source: set_fact 25675 1727203997.02347: variable 'interface' from source: set_fact 25675 1727203997.02521: variable 'interface' from source: set_fact 25675 1727203997.02692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203997.03714: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203997.03751: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203997.03980: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203997.03985: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203997.03987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203997.03990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203997.04005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.04149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203997.04267: variable '__network_team_connections_defined' from source: role '' defaults 
25675 1727203997.04734: variable 'network_connections' from source: play vars 25675 1727203997.04763: variable 'interface' from source: set_fact 25675 1727203997.04911: variable 'interface' from source: set_fact 25675 1727203997.04918: variable 'interface' from source: set_fact 25675 1727203997.04983: variable 'interface' from source: set_fact 25675 1727203997.05118: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727203997.05121: when evaluation is False, skipping this task 25675 1727203997.05124: _execute() done 25675 1727203997.05126: dumping result to json 25675 1727203997.05128: done dumping result, returning 25675 1727203997.05137: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000024] 25675 1727203997.05148: sending task result for task 028d2410-947f-41bd-b19d-000000000024 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727203997.05295: no more pending results, returning what we have 25675 1727203997.05299: results queue empty 25675 1727203997.05300: checking for any_errors_fatal 25675 1727203997.05307: done checking for any_errors_fatal 25675 1727203997.05308: checking for max_fail_percentage 25675 1727203997.05310: done checking for max_fail_percentage 25675 1727203997.05311: checking to see if all hosts have failed and the running result is not ok 25675 1727203997.05312: done checking to see if all hosts have failed 25675 1727203997.05313: getting the remaining hosts for this loop 25675 1727203997.05314: done getting the remaining hosts for this loop 25675 1727203997.05320: getting the next task for host managed-node2 25675 1727203997.05326: done getting next task for host managed-node2 25675 1727203997.05330: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25675 1727203997.05332: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203997.05346: getting variables 25675 1727203997.05348: in VariableManager get_vars() 25675 1727203997.05389: Calling all_inventory to load vars for managed-node2 25675 1727203997.05392: Calling groups_inventory to load vars for managed-node2 25675 1727203997.05395: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203997.05405: Calling all_plugins_play to load vars for managed-node2 25675 1727203997.05409: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203997.05412: Calling groups_plugins_play to load vars for managed-node2 25675 1727203997.06080: done sending task result for task 028d2410-947f-41bd-b19d-000000000024 25675 1727203997.06085: WORKER PROCESS EXITING 25675 1727203997.08953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203997.12632: done with get_vars() 25675 1727203997.12659: done getting variables 25675 1727203997.12726: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:53:17 -0400 (0:00:00.201) 0:00:16.578 ***** 25675 1727203997.12755: entering _queue_task() for managed-node2/service 25675 1727203997.13478: worker is 1 (out of 1 available) 25675 1727203997.13490: exiting _queue_task() for managed-node2/service 25675 1727203997.13499: done queuing things up, now waiting for results queue to drain 25675 1727203997.13500: waiting for pending results... 
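The restart skipped above is gated on the wireless/team flags, both of which evaluate False for this connection set. A hedged sketch of an equivalent task (service name and state are inferred from the task title; the when clause is the false_condition reported for the skip):

  # Sketch only: name/state inferred from the task title.
  - name: Restart NetworkManager due to wireless or team interfaces
    service:
      name: NetworkManager
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined

The next task, "Enable and start NetworkManager", goes through the same service action plugin, but its conditional (network_provider == "nm" or network_state != {}) evaluates True, so the trace below continues into the actual SSH connection, the AnsiballZ_systemd.py transfer, and the remote module execution.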
25675 1727203997.14094: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25675 1727203997.14324: in run() - task 028d2410-947f-41bd-b19d-000000000025 25675 1727203997.14328: variable 'ansible_search_path' from source: unknown 25675 1727203997.14331: variable 'ansible_search_path' from source: unknown 25675 1727203997.14333: calling self._execute() 25675 1727203997.14494: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203997.14506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203997.14520: variable 'omit' from source: magic vars 25675 1727203997.15343: variable 'ansible_distribution_major_version' from source: facts 25675 1727203997.15359: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203997.15706: variable 'network_provider' from source: set_fact 25675 1727203997.15745: variable 'network_state' from source: role '' defaults 25675 1727203997.15761: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25675 1727203997.15952: variable 'omit' from source: magic vars 25675 1727203997.15955: variable 'omit' from source: magic vars 25675 1727203997.15958: variable 'network_service_name' from source: role '' defaults 25675 1727203997.16113: variable 'network_service_name' from source: role '' defaults 25675 1727203997.16338: variable '__network_provider_setup' from source: role '' defaults 25675 1727203997.16350: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727203997.16608: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727203997.16611: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727203997.16614: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727203997.17095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203997.22860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203997.23482: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203997.23486: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203997.23488: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203997.23491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203997.24081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203997.24085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203997.24089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.24092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 25675 1727203997.24094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203997.24480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203997.24484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203997.24487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.24490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203997.24493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203997.24755: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25675 1727203997.25190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203997.25218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203997.25242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.25680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203997.25683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203997.25685: variable 'ansible_python' from source: facts 25675 1727203997.25687: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25675 1727203997.25937: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727203997.26020: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727203997.26880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203997.26884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203997.26887: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.26889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203997.26891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203997.26894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203997.26903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203997.27588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.27592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203997.27595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203997.27718: variable 'network_connections' from source: play vars 25675 1727203997.27731: variable 'interface' from source: set_fact 25675 1727203997.27812: variable 'interface' from source: set_fact 25675 1727203997.27908: variable 'interface' from source: set_fact 25675 1727203997.28089: variable 'interface' from source: set_fact 25675 1727203997.28202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203997.28581: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203997.28830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203997.28880: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203997.28925: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203997.29242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203997.29282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203997.29319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203997.29357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 25675 1727203997.29412: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727203997.30285: variable 'network_connections' from source: play vars 25675 1727203997.30489: variable 'interface' from source: set_fact 25675 1727203997.30557: variable 'interface' from source: set_fact 25675 1727203997.31180: variable 'interface' from source: set_fact 25675 1727203997.31183: variable 'interface' from source: set_fact 25675 1727203997.31186: variable '__network_packages_default_wireless' from source: role '' defaults 25675 1727203997.31647: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727203997.32420: variable 'network_connections' from source: play vars 25675 1727203997.32432: variable 'interface' from source: set_fact 25675 1727203997.32511: variable 'interface' from source: set_fact 25675 1727203997.32783: variable 'interface' from source: set_fact 25675 1727203997.32852: variable 'interface' from source: set_fact 25675 1727203997.33481: variable '__network_packages_default_team' from source: role '' defaults 25675 1727203997.33485: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727203997.34586: variable 'network_connections' from source: play vars 25675 1727203997.34589: variable 'interface' from source: set_fact 25675 1727203997.34592: variable 'interface' from source: set_fact 25675 1727203997.34594: variable 'interface' from source: set_fact 25675 1727203997.34663: variable 'interface' from source: set_fact 25675 1727203997.34950: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727203997.35019: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727203997.35190: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727203997.35253: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727203997.36381: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25675 1727203997.37281: variable 'network_connections' from source: play vars 25675 1727203997.37488: variable 'interface' from source: set_fact 25675 1727203997.37636: variable 'interface' from source: set_fact 25675 1727203997.37789: variable 'interface' from source: set_fact 25675 1727203997.37845: variable 'interface' from source: set_fact 25675 1727203997.38180: variable 'ansible_distribution' from source: facts 25675 1727203997.38183: variable '__network_rh_distros' from source: role '' defaults 25675 1727203997.38185: variable 'ansible_distribution_major_version' from source: facts 25675 1727203997.38187: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25675 1727203997.38492: variable 'ansible_distribution' from source: facts 25675 1727203997.38687: variable '__network_rh_distros' from source: role '' defaults 25675 1727203997.38698: variable 'ansible_distribution_major_version' from source: facts 25675 1727203997.38715: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25675 1727203997.39095: variable 'ansible_distribution' from source: facts 25675 1727203997.39287: variable '__network_rh_distros' from source: role '' defaults 25675 1727203997.39298: variable 'ansible_distribution_major_version' from source: facts 25675 1727203997.39343: variable 'network_provider' from source: set_fact 25675 1727203997.39780: 
variable 'omit' from source: magic vars 25675 1727203997.39783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203997.39786: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203997.39788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203997.39790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203997.39793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203997.39795: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203997.39797: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203997.39799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203997.40244: Set connection var ansible_shell_type to sh 25675 1727203997.40256: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203997.40267: Set connection var ansible_timeout to 10 25675 1727203997.40283: Set connection var ansible_pipelining to False 25675 1727203997.40294: Set connection var ansible_shell_executable to /bin/sh 25675 1727203997.40300: Set connection var ansible_connection to ssh 25675 1727203997.40334: variable 'ansible_shell_executable' from source: unknown 25675 1727203997.40680: variable 'ansible_connection' from source: unknown 25675 1727203997.40684: variable 'ansible_module_compression' from source: unknown 25675 1727203997.40686: variable 'ansible_shell_type' from source: unknown 25675 1727203997.40689: variable 'ansible_shell_executable' from source: unknown 25675 1727203997.40691: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203997.40698: variable 'ansible_pipelining' from source: unknown 25675 1727203997.40700: variable 'ansible_timeout' from source: unknown 25675 1727203997.40702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203997.40917: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727203997.40933: variable 'omit' from source: magic vars 25675 1727203997.40946: starting attempt loop 25675 1727203997.40953: running the handler 25675 1727203997.41040: variable 'ansible_facts' from source: unknown 25675 1727203997.43481: _low_level_execute_command(): starting 25675 1727203997.43782: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203997.45696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203997.46195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203997.47787: stdout chunk (state=3): >>>/root <<< 25675 1727203997.47886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203997.47922: stderr chunk (state=3): >>><<< 25675 1727203997.48100: stdout chunk (state=3): >>><<< 25675 1727203997.48207: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203997.48211: _low_level_execute_command(): starting 25675 1727203997.48214: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195 `" && echo ansible-tmp-1727203997.4812143-26969-115780157060195="` echo /root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195 `" ) && sleep 0' 25675 1727203997.49229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203997.49273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203997.49293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203997.49492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203997.49525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203997.49707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203997.49937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203997.52011: stdout chunk (state=3): >>>ansible-tmp-1727203997.4812143-26969-115780157060195=/root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195 <<< 25675 1727203997.52095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203997.52110: stdout chunk (state=3): >>><<< 25675 1727203997.52122: stderr chunk (state=3): >>><<< 25675 1727203997.52141: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203997.4812143-26969-115780157060195=/root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203997.52250: variable 'ansible_module_compression' from source: unknown 25675 1727203997.52312: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 25675 1727203997.52487: ANSIBALLZ: Acquiring lock 25675 1727203997.52583: ANSIBALLZ: Lock acquired: 139822507557424 25675 1727203997.52586: ANSIBALLZ: Creating module 25675 1727203998.35196: ANSIBALLZ: Writing module into payload 25675 1727203998.35782: ANSIBALLZ: Writing module 25675 1727203998.35879: ANSIBALLZ: Renaming module 25675 1727203998.35958: ANSIBALLZ: Done creating module 25675 1727203998.36031: variable 'ansible_facts' from source: unknown 25675 1727203998.36438: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/AnsiballZ_systemd.py 25675 1727203998.37021: Sending initial data 25675 1727203998.37025: Sent initial data (156 bytes) 25675 1727203998.37731: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203998.37755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203998.37789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203998.37806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727203998.37894: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203998.37917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203998.38034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203998.39705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727203998.39981: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727203998.40049: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpu7tp75ir /root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/AnsiballZ_systemd.py <<< 25675 1727203998.40053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/AnsiballZ_systemd.py" <<< 25675 1727203998.40151: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpu7tp75ir" to remote "/root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/AnsiballZ_systemd.py" <<< 25675 1727203998.43831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203998.43844: stdout chunk (state=3): >>><<< 25675 1727203998.43858: stderr chunk (state=3): >>><<< 25675 1727203998.44099: done transferring module to remote 25675 1727203998.44103: _low_level_execute_command(): starting 25675 1727203998.44106: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/ /root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/AnsiballZ_systemd.py && sleep 0' 25675 1727203998.45582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203998.45700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203998.45797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203998.45832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203998.45903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203998.47846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203998.47849: stdout chunk (state=3): >>><<< 25675 1727203998.47852: stderr chunk (state=3): >>><<< 25675 1727203998.47868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203998.47883: _low_level_execute_command(): starting 25675 1727203998.47895: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/AnsiballZ_systemd.py && sleep 0' 25675 1727203998.48999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203998.49030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203998.49045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203998.49064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203998.49089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727203998.49370: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203998.49417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203998.49625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203998.78829: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": 
"[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4489216", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299418112", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "721686000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 25675 1727203998.78896: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", 
"LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": 
"NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25675 1727203998.80802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727203998.80813: stdout chunk (state=3): >>><<< 25675 1727203998.80844: stderr chunk (state=3): >>><<< 25675 1727203998.80911: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4489216", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299418112", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "721686000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727203998.81584: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727203998.81657: _low_level_execute_command(): starting 25675 1727203998.81806: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203997.4812143-26969-115780157060195/ > /dev/null 2>&1 && sleep 0' 25675 1727203998.83213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727203998.83245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203998.83323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203998.83427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203998.83632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' <<< 25675 1727203998.83681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203998.83907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203998.85778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203998.85782: stdout chunk (state=3): >>><<< 25675 1727203998.85785: stderr chunk (state=3): >>><<< 25675 1727203998.85882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203998.85886: handler run complete 25675 1727203998.85889: attempt loop complete, returning result 25675 1727203998.85891: _execute() done 25675 1727203998.85894: dumping result to json 25675 1727203998.85919: done dumping result, returning 25675 1727203998.85933: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-41bd-b19d-000000000025] 25675 1727203998.85942: sending task result for task 028d2410-947f-41bd-b19d-000000000025 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727203998.87691: no more pending results, returning what we have 25675 1727203998.87695: results queue empty 25675 1727203998.87696: checking for any_errors_fatal 25675 1727203998.87703: done checking for any_errors_fatal 25675 1727203998.87704: checking for max_fail_percentage 25675 1727203998.87706: done checking for max_fail_percentage 25675 1727203998.87707: checking to see if all hosts have failed and the running result is not ok 25675 1727203998.87708: done checking to see if all hosts have failed 25675 1727203998.87708: getting the remaining hosts for this loop 25675 1727203998.87710: done getting the remaining hosts for this loop 25675 1727203998.87714: getting the next task for host managed-node2 25675 1727203998.87720: done getting next task for host managed-node2 25675 1727203998.87724: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25675 1727203998.87726: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203998.87762: getting variables 25675 1727203998.87764: in VariableManager get_vars() 25675 1727203998.87890: Calling all_inventory to load vars for managed-node2 25675 1727203998.87893: Calling groups_inventory to load vars for managed-node2 25675 1727203998.87896: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203998.87906: Calling all_plugins_play to load vars for managed-node2 25675 1727203998.87909: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203998.87912: Calling groups_plugins_play to load vars for managed-node2 25675 1727203998.88685: done sending task result for task 028d2410-947f-41bd-b19d-000000000025 25675 1727203998.89268: WORKER PROCESS EXITING 25675 1727203998.91417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203998.95296: done with get_vars() 25675 1727203998.95327: done getting variables 25675 1727203998.95391: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:53:18 -0400 (0:00:01.826) 0:00:18.405 ***** 25675 1727203998.95430: entering _queue_task() for managed-node2/service 25675 1727203998.95791: worker is 1 (out of 1 available) 25675 1727203998.95804: exiting _queue_task() for managed-node2/service 25675 1727203998.95816: done queuing things up, now waiting for results queue to drain 25675 1727203998.95818: waiting for pending results... 
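
The censored "ok" result above is the ansible.legacy.systemd module running with module_args name=NetworkManager, state=started, enabled=true (plus the defaults daemon_reload=false, daemon_reexec=false, scope=system, no_block=false), hidden by no_log: true. A standalone task that would produce an equivalent invocation might look like the sketch below; it is an illustration only, calling the builtin systemd module directly rather than the role's own task source, so the exact wording is hypothetical.

    - name: Enable and start NetworkManager        # matches the task name shown in the log
      ansible.builtin.systemd:
        name: NetworkManager                       # unit to manage
        state: started                             # same module_args as logged above
        enabled: true
      no_log: true                                 # why the callback prints the 'censored' placeholder

With no_log set, the callback replaces the full status dictionary (the large JSON block earlier in the log) with the "output has been hidden" message, but the module still returns changed=false because the unit was already active and enabled on managed-node2.
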
25675 1727203998.96103: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25675 1727203998.96209: in run() - task 028d2410-947f-41bd-b19d-000000000026 25675 1727203998.96230: variable 'ansible_search_path' from source: unknown 25675 1727203998.96235: variable 'ansible_search_path' from source: unknown 25675 1727203998.96285: calling self._execute() 25675 1727203998.96402: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203998.96409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203998.96426: variable 'omit' from source: magic vars 25675 1727203998.96899: variable 'ansible_distribution_major_version' from source: facts 25675 1727203998.96910: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203998.97055: variable 'network_provider' from source: set_fact 25675 1727203998.97081: Evaluated conditional (network_provider == "nm"): True 25675 1727203998.97183: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727203998.97287: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727203998.97496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203999.02981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203999.03256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203999.03301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203999.03457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203999.03490: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203999.03640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203999.03817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203999.03832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203999.03879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203999.03981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203999.04161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203999.04234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25675 1727203999.04324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203999.04328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203999.04332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203999.04522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203999.04525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203999.04528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203999.04679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203999.04707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203999.05036: variable 'network_connections' from source: play vars 25675 1727203999.05079: variable 'interface' from source: set_fact 25675 1727203999.05399: variable 'interface' from source: set_fact 25675 1727203999.05408: variable 'interface' from source: set_fact 25675 1727203999.05521: variable 'interface' from source: set_fact 25675 1727203999.05726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727203999.06145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727203999.06299: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727203999.06335: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727203999.06364: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727203999.06555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727203999.06582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727203999.06613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203999.06826: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727203999.06830: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727203999.07189: variable 'network_connections' from source: play vars 25675 1727203999.07196: variable 'interface' from source: set_fact 25675 1727203999.07256: variable 'interface' from source: set_fact 25675 1727203999.07262: variable 'interface' from source: set_fact 25675 1727203999.07351: variable 'interface' from source: set_fact 25675 1727203999.07431: Evaluated conditional (__network_wpa_supplicant_required): False 25675 1727203999.07435: when evaluation is False, skipping this task 25675 1727203999.07438: _execute() done 25675 1727203999.07471: dumping result to json 25675 1727203999.07477: done dumping result, returning 25675 1727203999.07481: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-41bd-b19d-000000000026] 25675 1727203999.07483: sending task result for task 028d2410-947f-41bd-b19d-000000000026 25675 1727203999.07734: done sending task result for task 028d2410-947f-41bd-b19d-000000000026 25675 1727203999.07738: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25675 1727203999.07792: no more pending results, returning what we have 25675 1727203999.07795: results queue empty 25675 1727203999.07796: checking for any_errors_fatal 25675 1727203999.07816: done checking for any_errors_fatal 25675 1727203999.07817: checking for max_fail_percentage 25675 1727203999.07819: done checking for max_fail_percentage 25675 1727203999.07820: checking to see if all hosts have failed and the running result is not ok 25675 1727203999.07821: done checking to see if all hosts have failed 25675 1727203999.07821: getting the remaining hosts for this loop 25675 1727203999.07823: done getting the remaining hosts for this loop 25675 1727203999.07826: getting the next task for host managed-node2 25675 1727203999.07831: done getting next task for host managed-node2 25675 1727203999.07835: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25675 1727203999.07836: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203999.07849: getting variables 25675 1727203999.07850: in VariableManager get_vars() 25675 1727203999.07891: Calling all_inventory to load vars for managed-node2 25675 1727203999.07893: Calling groups_inventory to load vars for managed-node2 25675 1727203999.07896: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203999.07906: Calling all_plugins_play to load vars for managed-node2 25675 1727203999.08042: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203999.08047: Calling groups_plugins_play to load vars for managed-node2 25675 1727203999.10541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203999.13562: done with get_vars() 25675 1727203999.13598: done getting variables 25675 1727203999.13667: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.183) 0:00:18.588 ***** 25675 1727203999.13736: entering _queue_task() for managed-node2/service 25675 1727203999.14504: worker is 1 (out of 1 available) 25675 1727203999.14514: exiting _queue_task() for managed-node2/service 25675 1727203999.14526: done queuing things up, now waiting for results queue to drain 25675 1727203999.14527: waiting for pending results... 25675 1727203999.14902: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 25675 1727203999.14992: in run() - task 028d2410-947f-41bd-b19d-000000000027 25675 1727203999.15003: variable 'ansible_search_path' from source: unknown 25675 1727203999.15006: variable 'ansible_search_path' from source: unknown 25675 1727203999.15098: calling self._execute() 25675 1727203999.15283: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203999.15287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203999.15290: variable 'omit' from source: magic vars 25675 1727203999.15953: variable 'ansible_distribution_major_version' from source: facts 25675 1727203999.15957: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203999.16201: variable 'network_provider' from source: set_fact 25675 1727203999.16204: Evaluated conditional (network_provider == "initscripts"): False 25675 1727203999.16206: when evaluation is False, skipping this task 25675 1727203999.16208: _execute() done 25675 1727203999.16210: dumping result to json 25675 1727203999.16212: done dumping result, returning 25675 1727203999.16214: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-41bd-b19d-000000000027] 25675 1727203999.16215: sending task result for task 028d2410-947f-41bd-b19d-000000000027 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727203999.16326: no more pending results, returning what we have 25675 1727203999.16331: results queue empty 25675 1727203999.16332: checking for 
any_errors_fatal 25675 1727203999.16343: done checking for any_errors_fatal 25675 1727203999.16344: checking for max_fail_percentage 25675 1727203999.16346: done checking for max_fail_percentage 25675 1727203999.16347: checking to see if all hosts have failed and the running result is not ok 25675 1727203999.16348: done checking to see if all hosts have failed 25675 1727203999.16349: getting the remaining hosts for this loop 25675 1727203999.16350: done getting the remaining hosts for this loop 25675 1727203999.16354: getting the next task for host managed-node2 25675 1727203999.16362: done getting next task for host managed-node2 25675 1727203999.16366: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25675 1727203999.16370: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727203999.16488: done sending task result for task 028d2410-947f-41bd-b19d-000000000027 25675 1727203999.16498: getting variables 25675 1727203999.16500: in VariableManager get_vars() 25675 1727203999.16536: Calling all_inventory to load vars for managed-node2 25675 1727203999.16539: Calling groups_inventory to load vars for managed-node2 25675 1727203999.16541: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203999.16552: Calling all_plugins_play to load vars for managed-node2 25675 1727203999.16555: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203999.16558: Calling groups_plugins_play to load vars for managed-node2 25675 1727203999.17083: WORKER PROCESS EXITING 25675 1727203999.19018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203999.22190: done with get_vars() 25675 1727203999.22217: done getting variables 25675 1727203999.22288: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.085) 0:00:18.674 ***** 25675 1727203999.22317: entering _queue_task() for managed-node2/copy 25675 1727203999.22805: worker is 1 (out of 1 available) 25675 1727203999.22817: exiting _queue_task() for managed-node2/copy 25675 1727203999.22828: done queuing things up, now waiting for results queue to drain 25675 1727203999.22830: waiting for pending results... 
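
Both of the surrounding provider-specific tasks are gated on the initscripts provider: the log shows Evaluated conditional (network_provider == "initscripts"): False, so with network_provider set to "nm" the "Enable network service" task above and the "Ensure initscripts network file dependency is present" task below are skipped without any connection being opened to the host. A minimal sketch of that gating pattern follows; the service name "network" is an assumption for illustration, not taken from the role.

    - name: Enable network service
      ansible.builtin.service:
        name: network                              # illustrative legacy initscripts service name (assumption)
        state: started
        enabled: true
      when: network_provider == "initscripts"      # the conditional the log evaluates to False

Because the when clause is False, the TaskExecutor returns from _execute() immediately ("when evaluation is False, skipping this task"), and the callback prints skipping: [managed-node2] for both tasks, with the reason censored for the service task because it also carries no_log.
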
25675 1727203999.23194: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25675 1727203999.23235: in run() - task 028d2410-947f-41bd-b19d-000000000028 25675 1727203999.23307: variable 'ansible_search_path' from source: unknown 25675 1727203999.23314: variable 'ansible_search_path' from source: unknown 25675 1727203999.23356: calling self._execute() 25675 1727203999.23503: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203999.23519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203999.23532: variable 'omit' from source: magic vars 25675 1727203999.24061: variable 'ansible_distribution_major_version' from source: facts 25675 1727203999.24064: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203999.24187: variable 'network_provider' from source: set_fact 25675 1727203999.24199: Evaluated conditional (network_provider == "initscripts"): False 25675 1727203999.24207: when evaluation is False, skipping this task 25675 1727203999.24214: _execute() done 25675 1727203999.24221: dumping result to json 25675 1727203999.24230: done dumping result, returning 25675 1727203999.24242: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-41bd-b19d-000000000028] 25675 1727203999.24252: sending task result for task 028d2410-947f-41bd-b19d-000000000028 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 25675 1727203999.24525: no more pending results, returning what we have 25675 1727203999.24530: results queue empty 25675 1727203999.24531: checking for any_errors_fatal 25675 1727203999.24536: done checking for any_errors_fatal 25675 1727203999.24537: checking for max_fail_percentage 25675 1727203999.24539: done checking for max_fail_percentage 25675 1727203999.24540: checking to see if all hosts have failed and the running result is not ok 25675 1727203999.24541: done checking to see if all hosts have failed 25675 1727203999.24542: getting the remaining hosts for this loop 25675 1727203999.24543: done getting the remaining hosts for this loop 25675 1727203999.24547: getting the next task for host managed-node2 25675 1727203999.24555: done getting next task for host managed-node2 25675 1727203999.24559: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25675 1727203999.24561: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727203999.24583: getting variables 25675 1727203999.24585: in VariableManager get_vars() 25675 1727203999.24628: Calling all_inventory to load vars for managed-node2 25675 1727203999.24631: Calling groups_inventory to load vars for managed-node2 25675 1727203999.24633: Calling all_plugins_inventory to load vars for managed-node2 25675 1727203999.24644: Calling all_plugins_play to load vars for managed-node2 25675 1727203999.24647: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727203999.24650: Calling groups_plugins_play to load vars for managed-node2 25675 1727203999.25350: done sending task result for task 028d2410-947f-41bd-b19d-000000000028 25675 1727203999.25353: WORKER PROCESS EXITING 25675 1727203999.27970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727203999.31007: done with get_vars() 25675 1727203999.31034: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.089) 0:00:18.763 ***** 25675 1727203999.31222: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 25675 1727203999.31224: Creating lock for fedora.linux_system_roles.network_connections 25675 1727203999.31746: worker is 1 (out of 1 available) 25675 1727203999.31757: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 25675 1727203999.31770: done queuing things up, now waiting for results queue to drain 25675 1727203999.31772: waiting for pending results... 25675 1727203999.32348: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25675 1727203999.32465: in run() - task 028d2410-947f-41bd-b19d-000000000029 25675 1727203999.32495: variable 'ansible_search_path' from source: unknown 25675 1727203999.32505: variable 'ansible_search_path' from source: unknown 25675 1727203999.32548: calling self._execute() 25675 1727203999.32649: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203999.32661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203999.32680: variable 'omit' from source: magic vars 25675 1727203999.33064: variable 'ansible_distribution_major_version' from source: facts 25675 1727203999.33088: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727203999.33100: variable 'omit' from source: magic vars 25675 1727203999.33147: variable 'omit' from source: magic vars 25675 1727203999.33303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727203999.37254: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727203999.37257: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727203999.37259: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727203999.37582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727203999.37585: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727203999.37615: 
variable 'network_provider' from source: set_fact 25675 1727203999.37928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727203999.38656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727203999.38809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727203999.38852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727203999.38907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727203999.38991: variable 'omit' from source: magic vars 25675 1727203999.39329: variable 'omit' from source: magic vars 25675 1727203999.39650: variable 'network_connections' from source: play vars 25675 1727203999.39653: variable 'interface' from source: set_fact 25675 1727203999.39701: variable 'interface' from source: set_fact 25675 1727203999.39768: variable 'interface' from source: set_fact 25675 1727203999.39835: variable 'interface' from source: set_fact 25675 1727203999.40237: variable 'omit' from source: magic vars 25675 1727203999.40251: variable '__lsr_ansible_managed' from source: task vars 25675 1727203999.40326: variable '__lsr_ansible_managed' from source: task vars 25675 1727203999.40864: Loaded config def from plugin (lookup/template) 25675 1727203999.40982: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25675 1727203999.41011: File lookup term: get_ansible_managed.j2 25675 1727203999.41018: variable 'ansible_search_path' from source: unknown 25675 1727203999.41025: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25675 1727203999.41040: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25675 1727203999.41058: variable 'ansible_search_path' from source: 
unknown 25675 1727203999.55637: variable 'ansible_managed' from source: unknown 25675 1727203999.56325: variable 'omit' from source: magic vars 25675 1727203999.56352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727203999.56380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727203999.56490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727203999.56493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203999.56495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727203999.56510: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727203999.56514: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203999.56516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203999.56841: Set connection var ansible_shell_type to sh 25675 1727203999.56964: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727203999.56969: Set connection var ansible_timeout to 10 25675 1727203999.56978: Set connection var ansible_pipelining to False 25675 1727203999.56988: Set connection var ansible_shell_executable to /bin/sh 25675 1727203999.56991: Set connection var ansible_connection to ssh 25675 1727203999.57024: variable 'ansible_shell_executable' from source: unknown 25675 1727203999.57031: variable 'ansible_connection' from source: unknown 25675 1727203999.57033: variable 'ansible_module_compression' from source: unknown 25675 1727203999.57036: variable 'ansible_shell_type' from source: unknown 25675 1727203999.57038: variable 'ansible_shell_executable' from source: unknown 25675 1727203999.57040: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727203999.57042: variable 'ansible_pipelining' from source: unknown 25675 1727203999.57044: variable 'ansible_timeout' from source: unknown 25675 1727203999.57046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727203999.57399: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727203999.57411: variable 'omit' from source: magic vars 25675 1727203999.57466: starting attempt loop 25675 1727203999.57470: running the handler 25675 1727203999.57476: _low_level_execute_command(): starting 25675 1727203999.57479: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727203999.59372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203999.59383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203999.59385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203999.59387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203999.59389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727203999.59391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203999.59792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203999.59900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203999.61983: stdout chunk (state=3): >>>/root <<< 25675 1727203999.61987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203999.61989: stdout chunk (state=3): >>><<< 25675 1727203999.61991: stderr chunk (state=3): >>><<< 25675 1727203999.61993: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203999.61996: _low_level_execute_command(): starting 25675 1727203999.61998: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846 `" && echo ansible-tmp-1727203999.619252-27212-46235089171846="` echo /root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846 `" ) && sleep 0' 25675 1727203999.63098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727203999.63102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727203999.63191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727203999.63194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203999.63208: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 25675 1727203999.63213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727203999.63227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727203999.63232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727203999.63489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727203999.63511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727203999.63609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727203999.65750: stdout chunk (state=3): >>>ansible-tmp-1727203999.619252-27212-46235089171846=/root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846 <<< 25675 1727203999.65754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727203999.65804: stderr chunk (state=3): >>><<< 25675 1727203999.65807: stdout chunk (state=3): >>><<< 25675 1727203999.65886: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203999.619252-27212-46235089171846=/root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727203999.65890: variable 'ansible_module_compression' from source: unknown 25675 1727203999.66022: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 25675 1727203999.66026: ANSIBALLZ: Acquiring lock 25675 1727203999.66028: ANSIBALLZ: Lock acquired: 139822501924976 25675 1727203999.66044: ANSIBALLZ: Creating module 25675 1727204000.02983: ANSIBALLZ: Writing module into payload 25675 1727204000.03290: ANSIBALLZ: Writing module 25675 1727204000.03313: ANSIBALLZ: Renaming module 25675 1727204000.03316: ANSIBALLZ: Done creating module 25675 1727204000.03342: variable 'ansible_facts' from source: unknown 25675 1727204000.03548: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/AnsiballZ_network_connections.py 25675 1727204000.03599: Sending initial data 25675 1727204000.03602: Sent initial data (166 bytes) 25675 1727204000.04564: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204000.04791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204000.04829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204000.04833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204000.04982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204000.06582: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204000.06680: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204000.07119: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpp7g5q71_ /root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/AnsiballZ_network_connections.py <<< 25675 1727204000.07122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/AnsiballZ_network_connections.py" <<< 25675 1727204000.07125: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpp7g5q71_" to remote "/root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/AnsiballZ_network_connections.py" <<< 25675 1727204000.09959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204000.09963: stdout chunk (state=3): >>><<< 25675 1727204000.09966: stderr chunk (state=3): >>><<< 25675 1727204000.10108: done transferring module to remote 25675 1727204000.10211: _low_level_execute_command(): starting 25675 1727204000.10214: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/ /root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/AnsiballZ_network_connections.py && sleep 0' 25675 1727204000.11697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204000.11798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204000.11969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204000.12330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204000.12334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204000.14163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204000.14224: stderr chunk (state=3): >>><<< 25675 1727204000.14230: stdout chunk (state=3): >>><<< 25675 1727204000.14325: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204000.14446: _low_level_execute_command(): starting 25675 1727204000.14457: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/AnsiballZ_network_connections.py && sleep 0' 25675 1727204000.16100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204000.16181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204000.16583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204000.16592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204000.16594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204000.16910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204000.58095: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}} <<< 25675 1727204000.59971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204000.59980: stdout chunk (state=3): >>><<< 25675 1727204000.59982: stderr chunk (state=3): >>><<< 25675 1727204000.60000: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727204000.60041: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204000.60050: _low_level_execute_command(): starting 25675 1727204000.60055: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203999.619252-27212-46235089171846/ > /dev/null 2>&1 && sleep 0' 25675 1727204000.60680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204000.60683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204000.60701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204000.60704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204000.60706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204000.60708: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204000.60710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204000.60815: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204000.60992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204000.61167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204000.63131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204000.63135: stdout chunk (state=3): >>><<< 25675 1727204000.63141: stderr chunk (state=3): >>><<< 25675 1727204000.63159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204000.63166: handler run complete 25675 1727204000.63228: attempt loop complete, returning result 25675 1727204000.63231: _execute() done 25675 1727204000.63234: dumping result to json 25675 1727204000.63236: done dumping result, returning 25675 1727204000.63238: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-41bd-b19d-000000000029] 25675 1727204000.63240: sending task result for task 028d2410-947f-41bd-b19d-000000000029 25675 1727204000.63358: done sending task result for task 028d2410-947f-41bd-b19d-000000000029 changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee (not-active) 25675 1727204000.63697: no more pending results, returning what we have 25675 1727204000.63701: results queue empty 25675 1727204000.63702: checking for any_errors_fatal 25675 1727204000.63712: done checking for any_errors_fatal 25675 1727204000.63713: checking for max_fail_percentage 25675 1727204000.63715: done checking for max_fail_percentage 25675 1727204000.63716: checking to see if all hosts have failed and the running result is not ok 25675 1727204000.63717: done checking to see if all hosts have failed 25675 1727204000.63717: getting the remaining hosts for this loop 25675 1727204000.63719: done getting the remaining hosts for this loop 25675 1727204000.63722: getting the next task for host managed-node2 25675 1727204000.63728: done getting next task for host managed-node2 25675 1727204000.63732: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 25675 1727204000.63734: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204000.63744: getting variables 25675 1727204000.63746: in VariableManager get_vars() 25675 1727204000.63804: Calling all_inventory to load vars for managed-node2 25675 1727204000.63807: Calling groups_inventory to load vars for managed-node2 25675 1727204000.63810: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204000.63817: WORKER PROCESS EXITING 25675 1727204000.63827: Calling all_plugins_play to load vars for managed-node2 25675 1727204000.63829: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204000.63833: Calling groups_plugins_play to load vars for managed-node2 25675 1727204000.67499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204000.70091: done with get_vars() 25675 1727204000.70129: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:53:20 -0400 (0:00:01.390) 0:00:20.153 ***** 25675 1727204000.70229: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 25675 1727204000.70231: Creating lock for fedora.linux_system_roles.network_state 25675 1727204000.70616: worker is 1 (out of 1 available) 25675 1727204000.70629: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 25675 1727204000.70740: done queuing things up, now waiting for results queue to drain 25675 1727204000.70742: waiting for pending results... 25675 1727204000.70971: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 25675 1727204000.71113: in run() - task 028d2410-947f-41bd-b19d-00000000002a 25675 1727204000.71135: variable 'ansible_search_path' from source: unknown 25675 1727204000.71144: variable 'ansible_search_path' from source: unknown 25675 1727204000.71190: calling self._execute() 25675 1727204000.71318: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.71322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.71428: variable 'omit' from source: magic vars 25675 1727204000.71767: variable 'ansible_distribution_major_version' from source: facts 25675 1727204000.71788: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204000.71924: variable 'network_state' from source: role '' defaults 25675 1727204000.71941: Evaluated conditional (network_state != {}): False 25675 1727204000.71948: when evaluation is False, skipping this task 25675 1727204000.71957: _execute() done 25675 1727204000.71983: dumping result to json 25675 1727204000.71992: done dumping result, returning 25675 1727204000.72003: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-41bd-b19d-00000000002a] 25675 1727204000.72013: sending task result for task 028d2410-947f-41bd-b19d-00000000002a 25675 1727204000.72225: done sending task result for task 028d2410-947f-41bd-b19d-00000000002a 25675 1727204000.72228: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204000.72293: no more pending results, returning what we have 25675 1727204000.72298: results queue empty 25675 1727204000.72298: checking for 
any_errors_fatal 25675 1727204000.72310: done checking for any_errors_fatal 25675 1727204000.72311: checking for max_fail_percentage 25675 1727204000.72313: done checking for max_fail_percentage 25675 1727204000.72315: checking to see if all hosts have failed and the running result is not ok 25675 1727204000.72316: done checking to see if all hosts have failed 25675 1727204000.72317: getting the remaining hosts for this loop 25675 1727204000.72318: done getting the remaining hosts for this loop 25675 1727204000.72322: getting the next task for host managed-node2 25675 1727204000.72329: done getting next task for host managed-node2 25675 1727204000.72333: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25675 1727204000.72338: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204000.72360: getting variables 25675 1727204000.72362: in VariableManager get_vars() 25675 1727204000.72405: Calling all_inventory to load vars for managed-node2 25675 1727204000.72410: Calling groups_inventory to load vars for managed-node2 25675 1727204000.72413: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204000.72424: Calling all_plugins_play to load vars for managed-node2 25675 1727204000.72428: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204000.72431: Calling groups_plugins_play to load vars for managed-node2 25675 1727204000.74118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204000.75814: done with get_vars() 25675 1727204000.75852: done getting variables 25675 1727204000.75916: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.057) 0:00:20.210 ***** 25675 1727204000.75957: entering _queue_task() for managed-node2/debug 25675 1727204000.76507: worker is 1 (out of 1 available) 25675 1727204000.76517: exiting _queue_task() for managed-node2/debug 25675 1727204000.76527: done queuing things up, now waiting for results queue to drain 25675 1727204000.76528: waiting for pending results... 
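Two conditionals decide the control flow in this stretch of the log: ansible_distribution_major_version != '6' gates every task of the role and evaluates True, while network_state != {} evaluates False against the role default of an empty dict, so the 'Configure networking state' task is skipped. Ansible resolves these through its own templar; the effect can be approximated with Jinja2's expression compiler, as in the sketch below. The distribution version used here is an assumption, since the log only shows that it is not '6'.

from jinja2 import Environment

env = Environment()

task_vars = {
    "ansible_distribution_major_version": "9",  # assumed value; the log only shows it differs from '6'
    "network_state": {},                        # role default, hence the skip above
}

for expression in ("ansible_distribution_major_version != '6'",
                   "network_state != {}"):
    check = env.compile_expression(expression)
    print(f"Evaluated conditional ({expression}): {bool(check(**task_vars))}")

# Prints True for the version check and False for network_state, matching the
# "skipping: [managed-node2] ... Conditional result was False" output above.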
25675 1727204000.77097: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25675 1727204000.77102: in run() - task 028d2410-947f-41bd-b19d-00000000002b 25675 1727204000.77104: variable 'ansible_search_path' from source: unknown 25675 1727204000.77106: variable 'ansible_search_path' from source: unknown 25675 1727204000.77109: calling self._execute() 25675 1727204000.77265: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.77521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.77525: variable 'omit' from source: magic vars 25675 1727204000.78220: variable 'ansible_distribution_major_version' from source: facts 25675 1727204000.78238: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204000.78251: variable 'omit' from source: magic vars 25675 1727204000.78331: variable 'omit' from source: magic vars 25675 1727204000.78430: variable 'omit' from source: magic vars 25675 1727204000.78547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204000.78594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204000.78855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204000.78858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204000.78860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204000.78862: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204000.78864: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.78866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.79001: Set connection var ansible_shell_type to sh 25675 1727204000.79012: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204000.79026: Set connection var ansible_timeout to 10 25675 1727204000.79035: Set connection var ansible_pipelining to False 25675 1727204000.79044: Set connection var ansible_shell_executable to /bin/sh 25675 1727204000.79050: Set connection var ansible_connection to ssh 25675 1727204000.79097: variable 'ansible_shell_executable' from source: unknown 25675 1727204000.79104: variable 'ansible_connection' from source: unknown 25675 1727204000.79111: variable 'ansible_module_compression' from source: unknown 25675 1727204000.79118: variable 'ansible_shell_type' from source: unknown 25675 1727204000.79129: variable 'ansible_shell_executable' from source: unknown 25675 1727204000.79136: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.79144: variable 'ansible_pipelining' from source: unknown 25675 1727204000.79150: variable 'ansible_timeout' from source: unknown 25675 1727204000.79157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.79330: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 
1727204000.79352: variable 'omit' from source: magic vars 25675 1727204000.79362: starting attempt loop 25675 1727204000.79369: running the handler 25675 1727204000.79539: variable '__network_connections_result' from source: set_fact 25675 1727204000.79603: handler run complete 25675 1727204000.79678: attempt loop complete, returning result 25675 1727204000.79682: _execute() done 25675 1727204000.79684: dumping result to json 25675 1727204000.79687: done dumping result, returning 25675 1727204000.79689: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-41bd-b19d-00000000002b] 25675 1727204000.79691: sending task result for task 028d2410-947f-41bd-b19d-00000000002b ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee (not-active)" ] } 25675 1727204000.80011: no more pending results, returning what we have 25675 1727204000.80015: results queue empty 25675 1727204000.80016: checking for any_errors_fatal 25675 1727204000.80024: done checking for any_errors_fatal 25675 1727204000.80025: checking for max_fail_percentage 25675 1727204000.80027: done checking for max_fail_percentage 25675 1727204000.80028: checking to see if all hosts have failed and the running result is not ok 25675 1727204000.80029: done checking to see if all hosts have failed 25675 1727204000.80030: getting the remaining hosts for this loop 25675 1727204000.80031: done getting the remaining hosts for this loop 25675 1727204000.80035: getting the next task for host managed-node2 25675 1727204000.80041: done getting next task for host managed-node2 25675 1727204000.80046: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25675 1727204000.80048: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204000.80059: getting variables 25675 1727204000.80061: in VariableManager get_vars() 25675 1727204000.80104: Calling all_inventory to load vars for managed-node2 25675 1727204000.80107: Calling groups_inventory to load vars for managed-node2 25675 1727204000.80110: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204000.80121: Calling all_plugins_play to load vars for managed-node2 25675 1727204000.80124: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204000.80127: Calling groups_plugins_play to load vars for managed-node2 25675 1727204000.80692: done sending task result for task 028d2410-947f-41bd-b19d-00000000002b 25675 1727204000.80696: WORKER PROCESS EXITING 25675 1727204000.81965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204000.83766: done with get_vars() 25675 1727204000.83831: done getting variables 25675 1727204000.84017: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.080) 0:00:20.291 ***** 25675 1727204000.84047: entering _queue_task() for managed-node2/debug 25675 1727204000.84731: worker is 1 (out of 1 available) 25675 1727204000.84744: exiting _queue_task() for managed-node2/debug 25675 1727204000.84877: done queuing things up, now waiting for results queue to drain 25675 1727204000.84879: waiting for pending results... 
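The 'Show stderr messages' debug task above prints __network_connections_result.stderr_lines, which is effectively the module's stderr string broken into a list of lines; the [003]/[004] entries are the nm provider's progress messages for adding and then activating the 'lsr27' profile. A minimal reproduction of that derivation follows, with the stderr value copied verbatim from the module result earlier in the log.

__network_connections_result = {
    "changed": True,
    "stderr": (
        "[003] #0, state:up persistent_state:present, 'lsr27': "
        "add connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee\n"
        "[004] #0, state:up persistent_state:present, 'lsr27': "
        "up connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee (not-active)\n"
    ),
}

# *_lines fields in task results correspond to the string split on newlines;
# doing the same here reproduces the two-element list shown in the task output.
stderr_lines = __network_connections_result["stderr"].splitlines()
for line in stderr_lines:
    print(line)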
25675 1727204000.85179: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25675 1727204000.85312: in run() - task 028d2410-947f-41bd-b19d-00000000002c 25675 1727204000.85317: variable 'ansible_search_path' from source: unknown 25675 1727204000.85320: variable 'ansible_search_path' from source: unknown 25675 1727204000.85600: calling self._execute() 25675 1727204000.85690: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.85694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.85726: variable 'omit' from source: magic vars 25675 1727204000.86185: variable 'ansible_distribution_major_version' from source: facts 25675 1727204000.86190: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204000.86197: variable 'omit' from source: magic vars 25675 1727204000.86199: variable 'omit' from source: magic vars 25675 1727204000.86305: variable 'omit' from source: magic vars 25675 1727204000.86308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204000.86311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204000.86313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204000.86337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204000.86352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204000.86392: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204000.86401: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.86411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.86539: Set connection var ansible_shell_type to sh 25675 1727204000.86553: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204000.86636: Set connection var ansible_timeout to 10 25675 1727204000.86640: Set connection var ansible_pipelining to False 25675 1727204000.86643: Set connection var ansible_shell_executable to /bin/sh 25675 1727204000.86645: Set connection var ansible_connection to ssh 25675 1727204000.86647: variable 'ansible_shell_executable' from source: unknown 25675 1727204000.86649: variable 'ansible_connection' from source: unknown 25675 1727204000.86651: variable 'ansible_module_compression' from source: unknown 25675 1727204000.86653: variable 'ansible_shell_type' from source: unknown 25675 1727204000.86655: variable 'ansible_shell_executable' from source: unknown 25675 1727204000.86657: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.86659: variable 'ansible_pipelining' from source: unknown 25675 1727204000.86661: variable 'ansible_timeout' from source: unknown 25675 1727204000.86663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.86894: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 
1727204000.86898: variable 'omit' from source: magic vars 25675 1727204000.86901: starting attempt loop 25675 1727204000.86903: running the handler 25675 1727204000.86919: variable '__network_connections_result' from source: set_fact 25675 1727204000.87018: variable '__network_connections_result' from source: set_fact 25675 1727204000.87182: handler run complete 25675 1727204000.87189: attempt loop complete, returning result 25675 1727204000.87197: _execute() done 25675 1727204000.87204: dumping result to json 25675 1727204000.87221: done dumping result, returning 25675 1727204000.87235: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-41bd-b19d-00000000002c] 25675 1727204000.87245: sending task result for task 028d2410-947f-41bd-b19d-00000000002c 25675 1727204000.87624: done sending task result for task 028d2410-947f-41bd-b19d-00000000002c 25675 1727204000.87628: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 2337de5b-b8f2-42c8-892f-a64413dea3ee (not-active)" ] } } 25675 1727204000.87985: no more pending results, returning what we have 25675 1727204000.87988: results queue empty 25675 1727204000.87989: checking for any_errors_fatal 25675 1727204000.87995: done checking for any_errors_fatal 25675 1727204000.87996: checking for max_fail_percentage 25675 1727204000.87998: done checking for max_fail_percentage 25675 1727204000.87999: checking to see if all hosts have failed and the running result is not ok 25675 1727204000.88000: done checking to see if all hosts have failed 25675 1727204000.88000: getting the remaining hosts for this loop 25675 1727204000.88002: done getting the remaining hosts for this loop 25675 1727204000.88006: getting the next task for host managed-node2 25675 1727204000.88012: done getting next task for host managed-node2 25675 1727204000.88015: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25675 1727204000.88017: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204000.88028: getting variables 25675 1727204000.88030: in VariableManager get_vars() 25675 1727204000.88067: Calling all_inventory to load vars for managed-node2 25675 1727204000.88070: Calling groups_inventory to load vars for managed-node2 25675 1727204000.88183: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204000.88197: Calling all_plugins_play to load vars for managed-node2 25675 1727204000.88201: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204000.88204: Calling groups_plugins_play to load vars for managed-node2 25675 1727204000.90595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204000.92616: done with get_vars() 25675 1727204000.92641: done getting variables 25675 1727204000.92714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.086) 0:00:20.378 ***** 25675 1727204000.92746: entering _queue_task() for managed-node2/debug 25675 1727204000.93101: worker is 1 (out of 1 available) 25675 1727204000.93114: exiting _queue_task() for managed-node2/debug 25675 1727204000.93237: done queuing things up, now waiting for results queue to drain 25675 1727204000.93239: waiting for pending results... 25675 1727204000.93569: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25675 1727204000.93579: in run() - task 028d2410-947f-41bd-b19d-00000000002d 25675 1727204000.93670: variable 'ansible_search_path' from source: unknown 25675 1727204000.93683: variable 'ansible_search_path' from source: unknown 25675 1727204000.93727: calling self._execute() 25675 1727204000.93884: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.93889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.93893: variable 'omit' from source: magic vars 25675 1727204000.94283: variable 'ansible_distribution_major_version' from source: facts 25675 1727204000.94301: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204000.94434: variable 'network_state' from source: role '' defaults 25675 1727204000.94450: Evaluated conditional (network_state != {}): False 25675 1727204000.94458: when evaluation is False, skipping this task 25675 1727204000.94466: _execute() done 25675 1727204000.94483: dumping result to json 25675 1727204000.94534: done dumping result, returning 25675 1727204000.94538: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-41bd-b19d-00000000002d] 25675 1727204000.94541: sending task result for task 028d2410-947f-41bd-b19d-00000000002d 25675 1727204000.94626: done sending task result for task 028d2410-947f-41bd-b19d-00000000002d 25675 1727204000.94629: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 25675 1727204000.94724: no more pending results, returning what we 
have 25675 1727204000.94729: results queue empty 25675 1727204000.94729: checking for any_errors_fatal 25675 1727204000.94739: done checking for any_errors_fatal 25675 1727204000.94740: checking for max_fail_percentage 25675 1727204000.94741: done checking for max_fail_percentage 25675 1727204000.94742: checking to see if all hosts have failed and the running result is not ok 25675 1727204000.94743: done checking to see if all hosts have failed 25675 1727204000.94744: getting the remaining hosts for this loop 25675 1727204000.94858: done getting the remaining hosts for this loop 25675 1727204000.94863: getting the next task for host managed-node2 25675 1727204000.94870: done getting next task for host managed-node2 25675 1727204000.94879: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25675 1727204000.94882: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204000.94898: getting variables 25675 1727204000.94900: in VariableManager get_vars() 25675 1727204000.94940: Calling all_inventory to load vars for managed-node2 25675 1727204000.94943: Calling groups_inventory to load vars for managed-node2 25675 1727204000.94946: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204000.94959: Calling all_plugins_play to load vars for managed-node2 25675 1727204000.94962: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204000.94965: Calling groups_plugins_play to load vars for managed-node2 25675 1727204000.96467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204000.98044: done with get_vars() 25675 1727204000.98067: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.054) 0:00:20.432 ***** 25675 1727204000.98166: entering _queue_task() for managed-node2/ping 25675 1727204000.98168: Creating lock for ping 25675 1727204000.98514: worker is 1 (out of 1 available) 25675 1727204000.98527: exiting _queue_task() for managed-node2/ping 25675 1727204000.98539: done queuing things up, now waiting for results queue to drain 25675 1727204000.98540: waiting for pending results... 
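
The trace above records the role's "Show debug messages for the network_state" task being skipped because its conditional (network_state != {}) evaluated to False, after which the "Re-test connectivity" task is queued as a ping action. A minimal sketch of what those two tasks could look like in the role's tasks/main.yml, assuming they use the debug and ping modules named in the log (the actual task bodies are not reproduced in this trace):

    # Hypothetical reconstruction from the task names, module names, and
    # conditionals reported in the log; not the role's actual source.
    - name: Show debug messages for the network_state
      debug:
        var: network_state
      when: network_state != {}   # log: "Evaluated conditional (network_state != {}): False"

    - name: Re-test connectivity
      ping:                       # log: managed-node2/ping, result {"ping": "pong"}
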
25675 1727204000.98909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 25675 1727204000.98952: in run() - task 028d2410-947f-41bd-b19d-00000000002e 25675 1727204000.99004: variable 'ansible_search_path' from source: unknown 25675 1727204000.99007: variable 'ansible_search_path' from source: unknown 25675 1727204000.99029: calling self._execute() 25675 1727204000.99127: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.99181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204000.99185: variable 'omit' from source: magic vars 25675 1727204000.99585: variable 'ansible_distribution_major_version' from source: facts 25675 1727204000.99603: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204000.99615: variable 'omit' from source: magic vars 25675 1727204000.99670: variable 'omit' from source: magic vars 25675 1727204000.99714: variable 'omit' from source: magic vars 25675 1727204000.99768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204000.99880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204000.99884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204000.99886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204000.99889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204000.99927: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204000.99938: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204000.99947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.00061: Set connection var ansible_shell_type to sh 25675 1727204001.00078: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204001.00110: Set connection var ansible_timeout to 10 25675 1727204001.00130: Set connection var ansible_pipelining to False 25675 1727204001.00180: Set connection var ansible_shell_executable to /bin/sh 25675 1727204001.00183: Set connection var ansible_connection to ssh 25675 1727204001.00189: variable 'ansible_shell_executable' from source: unknown 25675 1727204001.00197: variable 'ansible_connection' from source: unknown 25675 1727204001.00215: variable 'ansible_module_compression' from source: unknown 25675 1727204001.00222: variable 'ansible_shell_type' from source: unknown 25675 1727204001.00228: variable 'ansible_shell_executable' from source: unknown 25675 1727204001.00236: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204001.00243: variable 'ansible_pipelining' from source: unknown 25675 1727204001.00322: variable 'ansible_timeout' from source: unknown 25675 1727204001.00326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.00522: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204001.00556: variable 'omit' from source: magic vars 25675 
1727204001.00567: starting attempt loop 25675 1727204001.00579: running the handler 25675 1727204001.00608: _low_level_execute_command(): starting 25675 1727204001.00626: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204001.01688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204001.01704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204001.01794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204001.01863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204001.01969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204001.03704: stdout chunk (state=3): >>>/root <<< 25675 1727204001.03863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204001.03874: stdout chunk (state=3): >>><<< 25675 1727204001.03879: stderr chunk (state=3): >>><<< 25675 1727204001.04002: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204001.04231: _low_level_execute_command(): starting 25675 1727204001.04235: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636 `" && echo ansible-tmp-1727204001.03961-27359-241269224977636="` echo 
/root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636 `" ) && sleep 0' 25675 1727204001.05068: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204001.05072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204001.05080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204001.05083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204001.05085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204001.05087: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204001.05089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204001.05097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204001.05099: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204001.05101: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727204001.05103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204001.05105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204001.05107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204001.05109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204001.05111: stderr chunk (state=3): >>>debug2: match found <<< 25675 1727204001.05113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204001.05155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204001.05158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204001.05181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204001.05285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204001.07294: stdout chunk (state=3): >>>ansible-tmp-1727204001.03961-27359-241269224977636=/root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636 <<< 25675 1727204001.07451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204001.07454: stdout chunk (state=3): >>><<< 25675 1727204001.07456: stderr chunk (state=3): >>><<< 25675 1727204001.07681: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204001.03961-27359-241269224977636=/root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204001.07685: variable 'ansible_module_compression' from source: unknown 25675 1727204001.07687: ANSIBALLZ: Using lock for ping 25675 1727204001.07689: ANSIBALLZ: Acquiring lock 25675 1727204001.07691: ANSIBALLZ: Lock acquired: 139822501921376 25675 1727204001.07692: ANSIBALLZ: Creating module 25675 1727204001.26168: ANSIBALLZ: Writing module into payload 25675 1727204001.26312: ANSIBALLZ: Writing module 25675 1727204001.26401: ANSIBALLZ: Renaming module 25675 1727204001.26451: ANSIBALLZ: Done creating module 25675 1727204001.26481: variable 'ansible_facts' from source: unknown 25675 1727204001.26588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/AnsiballZ_ping.py 25675 1727204001.26866: Sending initial data 25675 1727204001.26886: Sent initial data (151 bytes) 25675 1727204001.27959: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204001.27998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204001.28023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204001.28047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204001.28497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204001.30121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204001.30217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204001.30338: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp5jk_8udj /root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/AnsiballZ_ping.py <<< 25675 1727204001.30341: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/AnsiballZ_ping.py" <<< 25675 1727204001.30419: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp5jk_8udj" to remote "/root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/AnsiballZ_ping.py" <<< 25675 1727204001.31449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204001.31452: stderr chunk (state=3): >>><<< 25675 1727204001.31455: stdout chunk (state=3): >>><<< 25675 1727204001.31457: done transferring module to remote 25675 1727204001.31459: _low_level_execute_command(): starting 25675 1727204001.31461: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/ /root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/AnsiballZ_ping.py && sleep 0' 25675 1727204001.31991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204001.32192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204001.32231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204001.32787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204001.34618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204001.34666: stderr chunk (state=3): >>><<< 25675 1727204001.34723: stdout chunk (state=3): >>><<< 25675 1727204001.34744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204001.34747: _low_level_execute_command(): starting 25675 1727204001.34750: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/AnsiballZ_ping.py && sleep 0' 25675 1727204001.36267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204001.36271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204001.36294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204001.36297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727204001.36357: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204001.36448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204001.36487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204001.36578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204001.51653: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25675 1727204001.52939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204001.53195: stderr chunk (state=3): >>><<< 25675 1727204001.53199: stdout chunk (state=3): >>><<< 25675 1727204001.53217: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204001.53243: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204001.53252: _low_level_execute_command(): starting 25675 1727204001.53257: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204001.03961-27359-241269224977636/ > /dev/null 2>&1 && sleep 0' 25675 1727204001.54492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204001.54591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204001.54693: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204001.54741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204001.54746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204001.54833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204001.56986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204001.56989: stdout chunk (state=3): >>><<< 25675 1727204001.56992: stderr chunk (state=3): >>><<< 25675 1727204001.56994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204001.57001: handler run complete 25675 1727204001.57003: attempt loop complete, returning result 25675 1727204001.57005: _execute() done 25675 1727204001.57007: dumping result to json 25675 1727204001.57009: done dumping result, returning 25675 1727204001.57011: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-41bd-b19d-00000000002e] 25675 1727204001.57013: sending task result for task 028d2410-947f-41bd-b19d-00000000002e 25675 1727204001.57082: done sending task result for task 028d2410-947f-41bd-b19d-00000000002e 25675 1727204001.57379: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 25675 1727204001.57441: no more pending results, returning what we have 25675 1727204001.57444: results queue empty 25675 1727204001.57445: checking for any_errors_fatal 25675 1727204001.57450: done checking for any_errors_fatal 25675 1727204001.57451: checking for max_fail_percentage 25675 1727204001.57452: done checking for max_fail_percentage 25675 1727204001.57453: checking to see if all hosts have failed and the running result is not ok 25675 1727204001.57454: done checking to see if all hosts have failed 25675 1727204001.57455: getting the remaining hosts for this loop 25675 1727204001.57456: done getting the remaining hosts for this loop 25675 1727204001.57459: getting the next task for host managed-node2 25675 1727204001.57468: done getting next task for host managed-node2 25675 1727204001.57470: ^ task is: TASK: meta (role_complete) 25675 1727204001.57476: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204001.57487: getting variables 25675 1727204001.57489: in VariableManager get_vars() 25675 1727204001.57526: Calling all_inventory to load vars for managed-node2 25675 1727204001.57528: Calling groups_inventory to load vars for managed-node2 25675 1727204001.57530: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204001.57539: Calling all_plugins_play to load vars for managed-node2 25675 1727204001.57541: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204001.57544: Calling groups_plugins_play to load vars for managed-node2 25675 1727204001.59903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204001.62635: done with get_vars() 25675 1727204001.62667: done getting variables 25675 1727204001.62777: done queuing things up, now waiting for results queue to drain 25675 1727204001.62779: results queue empty 25675 1727204001.62780: checking for any_errors_fatal 25675 1727204001.62783: done checking for any_errors_fatal 25675 1727204001.62784: checking for max_fail_percentage 25675 1727204001.62785: done checking for max_fail_percentage 25675 1727204001.62786: checking to see if all hosts have failed and the running result is not ok 25675 1727204001.62787: done checking to see if all hosts have failed 25675 1727204001.62787: getting the remaining hosts for this loop 25675 1727204001.62788: done getting the remaining hosts for this loop 25675 1727204001.62791: getting the next task for host managed-node2 25675 1727204001.62795: done getting next task for host managed-node2 25675 1727204001.62797: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 25675 1727204001.62807: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204001.62810: getting variables 25675 1727204001.62811: in VariableManager get_vars() 25675 1727204001.62829: Calling all_inventory to load vars for managed-node2 25675 1727204001.62851: Calling groups_inventory to load vars for managed-node2 25675 1727204001.62855: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204001.62860: Calling all_plugins_play to load vars for managed-node2 25675 1727204001.62862: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204001.62865: Calling groups_plugins_play to load vars for managed-node2 25675 1727204001.64356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204001.67269: done with get_vars() 25675 1727204001.67316: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Tuesday 24 September 2024 14:53:21 -0400 (0:00:00.692) 0:00:21.125 ***** 25675 1727204001.67408: entering _queue_task() for managed-node2/include_tasks 25675 1727204001.67915: worker is 1 (out of 1 available) 25675 1727204001.67926: exiting _queue_task() for managed-node2/include_tasks 25675 1727204001.67937: done queuing things up, now waiting for results queue to drain 25675 1727204001.67939: waiting for pending results... 25675 1727204001.68144: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 25675 1727204001.68270: in run() - task 028d2410-947f-41bd-b19d-000000000030 25675 1727204001.68300: variable 'ansible_search_path' from source: unknown 25675 1727204001.68347: calling self._execute() 25675 1727204001.68482: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204001.68485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.68498: variable 'omit' from source: magic vars 25675 1727204001.68924: variable 'ansible_distribution_major_version' from source: facts 25675 1727204001.68980: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204001.68984: _execute() done 25675 1727204001.68986: dumping result to json 25675 1727204001.68989: done dumping result, returning 25675 1727204001.68992: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [028d2410-947f-41bd-b19d-000000000030] 25675 1727204001.68996: sending task result for task 028d2410-947f-41bd-b19d-000000000030 25675 1727204001.69237: no more pending results, returning what we have 25675 1727204001.69243: in VariableManager get_vars() 25675 1727204001.69297: Calling all_inventory to load vars for managed-node2 25675 1727204001.69481: Calling groups_inventory to load vars for managed-node2 25675 1727204001.69485: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204001.69497: Calling all_plugins_play to load vars for managed-node2 25675 1727204001.69500: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204001.69503: Calling groups_plugins_play to load vars for managed-node2 25675 1727204001.70199: done sending task result for task 028d2410-947f-41bd-b19d-000000000030 25675 1727204001.70202: WORKER PROCESS EXITING 25675 1727204001.72235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 
1727204001.73923: done with get_vars() 25675 1727204001.73946: variable 'ansible_search_path' from source: unknown 25675 1727204001.73962: we have included files to process 25675 1727204001.73963: generating all_blocks data 25675 1727204001.73966: done generating all_blocks data 25675 1727204001.73972: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 25675 1727204001.73978: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 25675 1727204001.73981: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 25675 1727204001.74372: done processing included file 25675 1727204001.74378: iterating over new_blocks loaded from include file 25675 1727204001.74380: in VariableManager get_vars() 25675 1727204001.74395: done with get_vars() 25675 1727204001.74397: filtering new block on tags 25675 1727204001.74416: done filtering new block on tags 25675 1727204001.74418: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed-node2 25675 1727204001.74423: extending task lists for all hosts with included blocks 25675 1727204001.74481: done extending task lists 25675 1727204001.74483: done processing included files 25675 1727204001.74483: results queue empty 25675 1727204001.74484: checking for any_errors_fatal 25675 1727204001.74486: done checking for any_errors_fatal 25675 1727204001.74486: checking for max_fail_percentage 25675 1727204001.74487: done checking for max_fail_percentage 25675 1727204001.74488: checking to see if all hosts have failed and the running result is not ok 25675 1727204001.74489: done checking to see if all hosts have failed 25675 1727204001.74490: getting the remaining hosts for this loop 25675 1727204001.74491: done getting the remaining hosts for this loop 25675 1727204001.74493: getting the next task for host managed-node2 25675 1727204001.74497: done getting next task for host managed-node2 25675 1727204001.74499: ^ task is: TASK: Assert that warnings is empty 25675 1727204001.74501: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204001.74503: getting variables 25675 1727204001.74504: in VariableManager get_vars() 25675 1727204001.74515: Calling all_inventory to load vars for managed-node2 25675 1727204001.74518: Calling groups_inventory to load vars for managed-node2 25675 1727204001.74520: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204001.74525: Calling all_plugins_play to load vars for managed-node2 25675 1727204001.74528: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204001.74531: Calling groups_plugins_play to load vars for managed-node2 25675 1727204001.75797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204001.77424: done with get_vars() 25675 1727204001.77445: done getting variables 25675 1727204001.77498: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Tuesday 24 September 2024 14:53:21 -0400 (0:00:00.101) 0:00:21.226 ***** 25675 1727204001.77531: entering _queue_task() for managed-node2/assert 25675 1727204001.77952: worker is 1 (out of 1 available) 25675 1727204001.77964: exiting _queue_task() for managed-node2/assert 25675 1727204001.77977: done queuing things up, now waiting for results queue to drain 25675 1727204001.77978: waiting for pending results... 
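
The include reported above pulls tasks/assert_output_in_stderr_without_warnings.yml into the play from tests_ethernet.yml:47. A minimal sketch of such an include, assuming a plain include_tasks call with a path relative to the test playbook (the log only shows the resolved absolute path under /tmp/collections-bGV):

    # Hypothetical sketch; the relative path form is an assumption.
    - name: Include the task 'assert_output_in_stderr_without_warnings.yml'
      include_tasks: tasks/assert_output_in_stderr_without_warnings.yml
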
25675 1727204001.78198: running TaskExecutor() for managed-node2/TASK: Assert that warnings is empty 25675 1727204001.78325: in run() - task 028d2410-947f-41bd-b19d-000000000304 25675 1727204001.78351: variable 'ansible_search_path' from source: unknown 25675 1727204001.78359: variable 'ansible_search_path' from source: unknown 25675 1727204001.78404: calling self._execute() 25675 1727204001.78536: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204001.78539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.78550: variable 'omit' from source: magic vars 25675 1727204001.78970: variable 'ansible_distribution_major_version' from source: facts 25675 1727204001.78981: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204001.78984: variable 'omit' from source: magic vars 25675 1727204001.79010: variable 'omit' from source: magic vars 25675 1727204001.79052: variable 'omit' from source: magic vars 25675 1727204001.79111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204001.79151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204001.79278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204001.79282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204001.79285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204001.79287: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204001.79290: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204001.79294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.79387: Set connection var ansible_shell_type to sh 25675 1727204001.79398: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204001.79420: Set connection var ansible_timeout to 10 25675 1727204001.79430: Set connection var ansible_pipelining to False 25675 1727204001.79440: Set connection var ansible_shell_executable to /bin/sh 25675 1727204001.79448: Set connection var ansible_connection to ssh 25675 1727204001.79484: variable 'ansible_shell_executable' from source: unknown 25675 1727204001.79522: variable 'ansible_connection' from source: unknown 25675 1727204001.79526: variable 'ansible_module_compression' from source: unknown 25675 1727204001.79528: variable 'ansible_shell_type' from source: unknown 25675 1727204001.79530: variable 'ansible_shell_executable' from source: unknown 25675 1727204001.79532: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204001.79534: variable 'ansible_pipelining' from source: unknown 25675 1727204001.79536: variable 'ansible_timeout' from source: unknown 25675 1727204001.79538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.79739: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204001.79743: variable 'omit' from source: magic vars 25675 
1727204001.79745: starting attempt loop 25675 1727204001.79748: running the handler 25675 1727204001.79877: variable '__network_connections_result' from source: set_fact 25675 1727204001.79896: Evaluated conditional ('warnings' not in __network_connections_result): True 25675 1727204001.79906: handler run complete 25675 1727204001.79922: attempt loop complete, returning result 25675 1727204001.79955: _execute() done 25675 1727204001.79958: dumping result to json 25675 1727204001.79960: done dumping result, returning 25675 1727204001.79962: done running TaskExecutor() for managed-node2/TASK: Assert that warnings is empty [028d2410-947f-41bd-b19d-000000000304] 25675 1727204001.79965: sending task result for task 028d2410-947f-41bd-b19d-000000000304 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 25675 1727204001.80112: no more pending results, returning what we have 25675 1727204001.80116: results queue empty 25675 1727204001.80117: checking for any_errors_fatal 25675 1727204001.80119: done checking for any_errors_fatal 25675 1727204001.80119: checking for max_fail_percentage 25675 1727204001.80122: done checking for max_fail_percentage 25675 1727204001.80123: checking to see if all hosts have failed and the running result is not ok 25675 1727204001.80124: done checking to see if all hosts have failed 25675 1727204001.80124: getting the remaining hosts for this loop 25675 1727204001.80126: done getting the remaining hosts for this loop 25675 1727204001.80130: getting the next task for host managed-node2 25675 1727204001.80140: done getting next task for host managed-node2 25675 1727204001.80143: ^ task is: TASK: Assert that there is output in stderr 25675 1727204001.80146: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204001.80151: getting variables 25675 1727204001.80153: in VariableManager get_vars() 25675 1727204001.80195: Calling all_inventory to load vars for managed-node2 25675 1727204001.80198: Calling groups_inventory to load vars for managed-node2 25675 1727204001.80201: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204001.80212: Calling all_plugins_play to load vars for managed-node2 25675 1727204001.80216: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204001.80219: Calling groups_plugins_play to load vars for managed-node2 25675 1727204001.80997: done sending task result for task 028d2410-947f-41bd-b19d-000000000304 25675 1727204001.81001: WORKER PROCESS EXITING 25675 1727204001.89366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204001.93171: done with get_vars() 25675 1727204001.93208: done getting variables 25675 1727204001.93399: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Tuesday 24 September 2024 14:53:21 -0400 (0:00:00.158) 0:00:21.385 ***** 25675 1727204001.93426: entering _queue_task() for managed-node2/assert 25675 1727204001.94226: worker is 1 (out of 1 available) 25675 1727204001.94239: exiting _queue_task() for managed-node2/assert 25675 1727204001.94367: done queuing things up, now waiting for results queue to drain 25675 1727204001.94370: waiting for pending results... 
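
The two assertions in the included file check the registered __network_connections_result fact; the conditionals the log reports evaluating are 'warnings' not in __network_connections_result and 'stderr' in __network_connections_result. A minimal sketch of the pair, assuming bare assert tasks around exactly those conditions (any msg options in the real file are omitted):

    # Sketch built from the conditionals shown in the trace.
    - name: Assert that warnings is empty
      assert:
        that:
          - "'warnings' not in __network_connections_result"

    - name: Assert that there is output in stderr
      assert:
        that:
          - "'stderr' in __network_connections_result"
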
25675 1727204001.95147: running TaskExecutor() for managed-node2/TASK: Assert that there is output in stderr 25675 1727204001.95252: in run() - task 028d2410-947f-41bd-b19d-000000000305 25675 1727204001.95270: variable 'ansible_search_path' from source: unknown 25675 1727204001.95277: variable 'ansible_search_path' from source: unknown 25675 1727204001.95544: calling self._execute() 25675 1727204001.95629: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204001.95633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.95643: variable 'omit' from source: magic vars 25675 1727204001.96463: variable 'ansible_distribution_major_version' from source: facts 25675 1727204001.96467: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204001.96470: variable 'omit' from source: magic vars 25675 1727204001.96699: variable 'omit' from source: magic vars 25675 1727204001.96762: variable 'omit' from source: magic vars 25675 1727204001.96780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204001.96828: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204001.96833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204001.96878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204001.96883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204001.97096: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204001.97100: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204001.97103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.97202: Set connection var ansible_shell_type to sh 25675 1727204001.97206: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204001.97216: Set connection var ansible_timeout to 10 25675 1727204001.97219: Set connection var ansible_pipelining to False 25675 1727204001.97278: Set connection var ansible_shell_executable to /bin/sh 25675 1727204001.97283: Set connection var ansible_connection to ssh 25675 1727204001.97285: variable 'ansible_shell_executable' from source: unknown 25675 1727204001.97287: variable 'ansible_connection' from source: unknown 25675 1727204001.97290: variable 'ansible_module_compression' from source: unknown 25675 1727204001.97291: variable 'ansible_shell_type' from source: unknown 25675 1727204001.97293: variable 'ansible_shell_executable' from source: unknown 25675 1727204001.97295: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204001.97297: variable 'ansible_pipelining' from source: unknown 25675 1727204001.97299: variable 'ansible_timeout' from source: unknown 25675 1727204001.97303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204001.97981: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204001.97985: variable 'omit' from source: magic vars 
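
The "Set connection var ..." entries above show the effective per-host connection settings for this task: sh shell type, /bin/sh shell executable, a 10-second timeout, pipelining disabled, and the ssh connection plugin. Expressed as standard Ansible inventory variables they would look roughly like the following; this is an assumption about how such values could be supplied, not a copy of the actual inventory:

    # Hypothetical host_vars sketch; values taken from the log, variable
    # names are the standard Ansible connection variables.
    managed-node2:
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_timeout: 10
      ansible_pipelining: false
      # ansible_host and ansible_ssh_extra_args also come from host vars
      # (see "from source: host vars for 'managed-node2'" above), but their
      # values are not printed in this trace.
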
25675 1727204001.97987: starting attempt loop 25675 1727204001.97989: running the handler 25675 1727204001.98388: variable '__network_connections_result' from source: set_fact 25675 1727204001.98392: Evaluated conditional ('stderr' in __network_connections_result): True 25675 1727204001.98394: handler run complete 25675 1727204001.98423: attempt loop complete, returning result 25675 1727204001.98426: _execute() done 25675 1727204001.98429: dumping result to json 25675 1727204001.98432: done dumping result, returning 25675 1727204001.98434: done running TaskExecutor() for managed-node2/TASK: Assert that there is output in stderr [028d2410-947f-41bd-b19d-000000000305] 25675 1727204001.98436: sending task result for task 028d2410-947f-41bd-b19d-000000000305 25675 1727204001.98787: done sending task result for task 028d2410-947f-41bd-b19d-000000000305 25675 1727204001.98791: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 25675 1727204001.98836: no more pending results, returning what we have 25675 1727204001.98839: results queue empty 25675 1727204001.98840: checking for any_errors_fatal 25675 1727204001.98846: done checking for any_errors_fatal 25675 1727204001.98847: checking for max_fail_percentage 25675 1727204001.98848: done checking for max_fail_percentage 25675 1727204001.98849: checking to see if all hosts have failed and the running result is not ok 25675 1727204001.98850: done checking to see if all hosts have failed 25675 1727204001.98850: getting the remaining hosts for this loop 25675 1727204001.98853: done getting the remaining hosts for this loop 25675 1727204001.98856: getting the next task for host managed-node2 25675 1727204001.98863: done getting next task for host managed-node2 25675 1727204001.98865: ^ task is: TASK: meta (flush_handlers) 25675 1727204001.98866: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204001.98870: getting variables 25675 1727204001.98872: in VariableManager get_vars() 25675 1727204001.99012: Calling all_inventory to load vars for managed-node2 25675 1727204001.99015: Calling groups_inventory to load vars for managed-node2 25675 1727204001.99017: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204001.99027: Calling all_plugins_play to load vars for managed-node2 25675 1727204001.99029: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204001.99032: Calling groups_plugins_play to load vars for managed-node2 25675 1727204002.01809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204002.05083: done with get_vars() 25675 1727204002.05108: done getting variables 25675 1727204002.05169: in VariableManager get_vars() 25675 1727204002.05297: Calling all_inventory to load vars for managed-node2 25675 1727204002.05300: Calling groups_inventory to load vars for managed-node2 25675 1727204002.05302: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204002.05308: Calling all_plugins_play to load vars for managed-node2 25675 1727204002.05310: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204002.05313: Calling groups_plugins_play to load vars for managed-node2 25675 1727204002.07016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204002.10171: done with get_vars() 25675 1727204002.10619: done queuing things up, now waiting for results queue to drain 25675 1727204002.10622: results queue empty 25675 1727204002.10623: checking for any_errors_fatal 25675 1727204002.10626: done checking for any_errors_fatal 25675 1727204002.10627: checking for max_fail_percentage 25675 1727204002.10628: done checking for max_fail_percentage 25675 1727204002.10629: checking to see if all hosts have failed and the running result is not ok 25675 1727204002.10630: done checking to see if all hosts have failed 25675 1727204002.10630: getting the remaining hosts for this loop 25675 1727204002.10637: done getting the remaining hosts for this loop 25675 1727204002.10641: getting the next task for host managed-node2 25675 1727204002.10646: done getting next task for host managed-node2 25675 1727204002.10647: ^ task is: TASK: meta (flush_handlers) 25675 1727204002.10649: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204002.10652: getting variables 25675 1727204002.10653: in VariableManager get_vars() 25675 1727204002.10667: Calling all_inventory to load vars for managed-node2 25675 1727204002.10670: Calling groups_inventory to load vars for managed-node2 25675 1727204002.10675: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204002.10682: Calling all_plugins_play to load vars for managed-node2 25675 1727204002.10685: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204002.10688: Calling groups_plugins_play to load vars for managed-node2 25675 1727204002.12490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204002.14229: done with get_vars() 25675 1727204002.14249: done getting variables 25675 1727204002.14319: in VariableManager get_vars() 25675 1727204002.14332: Calling all_inventory to load vars for managed-node2 25675 1727204002.14334: Calling groups_inventory to load vars for managed-node2 25675 1727204002.14336: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204002.14341: Calling all_plugins_play to load vars for managed-node2 25675 1727204002.14343: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204002.14346: Calling groups_plugins_play to load vars for managed-node2 25675 1727204002.15591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204002.17456: done with get_vars() 25675 1727204002.17497: done queuing things up, now waiting for results queue to drain 25675 1727204002.17499: results queue empty 25675 1727204002.17500: checking for any_errors_fatal 25675 1727204002.17501: done checking for any_errors_fatal 25675 1727204002.17502: checking for max_fail_percentage 25675 1727204002.17503: done checking for max_fail_percentage 25675 1727204002.17504: checking to see if all hosts have failed and the running result is not ok 25675 1727204002.17504: done checking to see if all hosts have failed 25675 1727204002.17505: getting the remaining hosts for this loop 25675 1727204002.17506: done getting the remaining hosts for this loop 25675 1727204002.17509: getting the next task for host managed-node2 25675 1727204002.17512: done getting next task for host managed-node2 25675 1727204002.17513: ^ task is: None 25675 1727204002.17514: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204002.17515: done queuing things up, now waiting for results queue to drain 25675 1727204002.17516: results queue empty 25675 1727204002.17517: checking for any_errors_fatal 25675 1727204002.17517: done checking for any_errors_fatal 25675 1727204002.17518: checking for max_fail_percentage 25675 1727204002.17519: done checking for max_fail_percentage 25675 1727204002.17519: checking to see if all hosts have failed and the running result is not ok 25675 1727204002.17520: done checking to see if all hosts have failed 25675 1727204002.17522: getting the next task for host managed-node2 25675 1727204002.17524: done getting next task for host managed-node2 25675 1727204002.17525: ^ task is: None 25675 1727204002.17526: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204002.17587: in VariableManager get_vars() 25675 1727204002.17603: done with get_vars() 25675 1727204002.17609: in VariableManager get_vars() 25675 1727204002.17618: done with get_vars() 25675 1727204002.17622: variable 'omit' from source: magic vars 25675 1727204002.17650: in VariableManager get_vars() 25675 1727204002.17659: done with get_vars() 25675 1727204002.17706: variable 'omit' from source: magic vars

PLAY [Play for cleaning up the test device and the connection profile] *********

25675 1727204002.17931: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727204002.17960: getting the remaining hosts for this loop 25675 1727204002.17962: done getting the remaining hosts for this loop 25675 1727204002.17964: getting the next task for host managed-node2 25675 1727204002.17967: done getting next task for host managed-node2 25675 1727204002.17969: ^ task is: TASK: Gathering Facts 25675 1727204002.17970: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204002.17972: getting variables 25675 1727204002.17984: in VariableManager get_vars() 25675 1727204002.18001: Calling all_inventory to load vars for managed-node2 25675 1727204002.18004: Calling groups_inventory to load vars for managed-node2 25675 1727204002.18006: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204002.18011: Calling all_plugins_play to load vars for managed-node2 25675 1727204002.18014: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204002.18023: Calling groups_plugins_play to load vars for managed-node2 25675 1727204002.19663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204002.21306: done with get_vars() 25675 1727204002.21335: done getting variables 25675 1727204002.21385: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
Tuesday 24 September 2024 14:53:22 -0400 (0:00:00.279) 0:00:21.665 *****

25675 1727204002.21409: entering _queue_task() for managed-node2/gather_facts 25675 1727204002.21755: worker is 1 (out of 1 available) 25675 1727204002.21768: exiting _queue_task() for managed-node2/gather_facts 25675 1727204002.21982: done queuing things up, now waiting for results queue to drain 25675 1727204002.21984: waiting for pending results... 
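For reference, the assertion that completes at the start of this excerpt evaluates the conditional ('stderr' in __network_connections_result), where __network_connections_result was registered earlier via set_fact. A minimal sketch of such a check, assuming that variable is already populated (the wording of the real task in the test playbook may differ):

    # Illustrative sketch only; not copied from tests_ethernet.yml
    - name: Assert that there is output in stderr
      ansible.builtin.assert:
        that:
          - "'stderr' in __network_connections_result"

The default success message of ansible.builtin.assert is "All assertions passed", which is exactly what the ok result above reports.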
25675 1727204002.22113: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727204002.22161: in run() - task 028d2410-947f-41bd-b19d-000000000316 25675 1727204002.22185: variable 'ansible_search_path' from source: unknown 25675 1727204002.22228: calling self._execute() 25675 1727204002.22327: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204002.22349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204002.22364: variable 'omit' from source: magic vars 25675 1727204002.22801: variable 'ansible_distribution_major_version' from source: facts 25675 1727204002.22863: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204002.22866: variable 'omit' from source: magic vars 25675 1727204002.22871: variable 'omit' from source: magic vars 25675 1727204002.22906: variable 'omit' from source: magic vars 25675 1727204002.22950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204002.22995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204002.23021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204002.23042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204002.23059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204002.23180: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204002.23185: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204002.23188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204002.23216: Set connection var ansible_shell_type to sh 25675 1727204002.23229: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204002.23240: Set connection var ansible_timeout to 10 25675 1727204002.23251: Set connection var ansible_pipelining to False 25675 1727204002.23302: Set connection var ansible_shell_executable to /bin/sh 25675 1727204002.23306: Set connection var ansible_connection to ssh 25675 1727204002.23308: variable 'ansible_shell_executable' from source: unknown 25675 1727204002.23310: variable 'ansible_connection' from source: unknown 25675 1727204002.23312: variable 'ansible_module_compression' from source: unknown 25675 1727204002.23320: variable 'ansible_shell_type' from source: unknown 25675 1727204002.23326: variable 'ansible_shell_executable' from source: unknown 25675 1727204002.23332: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204002.23341: variable 'ansible_pipelining' from source: unknown 25675 1727204002.23348: variable 'ansible_timeout' from source: unknown 25675 1727204002.23356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204002.23552: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204002.23580: variable 'omit' from source: magic vars 25675 1727204002.23583: starting attempt loop 25675 1727204002.23585: running the 
handler 25675 1727204002.23628: variable 'ansible_facts' from source: unknown 25675 1727204002.23632: _low_level_execute_command(): starting 25675 1727204002.23647: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204002.24332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204002.24401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204002.24460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204002.24482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204002.24506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204002.24692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204002.26421: stdout chunk (state=3): >>>/root <<< 25675 1727204002.26522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204002.26526: stdout chunk (state=3): >>><<< 25675 1727204002.26529: stderr chunk (state=3): >>><<< 25675 1727204002.26653: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204002.26657: _low_level_execute_command(): starting 25675 1727204002.26659: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491 `" && echo 
ansible-tmp-1727204002.265536-27418-86772236058491="` echo /root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491 `" ) && sleep 0' 25675 1727204002.27191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204002.27212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204002.27286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204002.27331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204002.27394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204002.27424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204002.27459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204002.27616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204002.29499: stdout chunk (state=3): >>>ansible-tmp-1727204002.265536-27418-86772236058491=/root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491 <<< 25675 1727204002.29563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204002.29881: stderr chunk (state=3): >>><<< 25675 1727204002.29885: stdout chunk (state=3): >>><<< 25675 1727204002.29887: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204002.265536-27418-86772236058491=/root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204002.29890: variable 'ansible_module_compression' from source: unknown 
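The connection variables echoed above (ansible_connection=ssh, ansible_shell_type=sh, ansible_pipelining=False, ansible_timeout=10, ansible_module_compression=ZIP_DEFLATED) and the ~/.ansible/tmp working directory created by _low_level_execute_command() are all standard, configurable connection settings. A hypothetical host_vars entry, not taken from the actual test inventory, that would pin the same values:

    # host_vars/managed-node2.yml -- illustrative only
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_module_compression: ZIP_DEFLATED
    # With pipelining off, the module payload is copied to a per-task
    # remote tmp dir over SFTP, which is what the following chunks show.
    ansible_pipelining: false
    # Parent of the ansible-tmp-* directories created for each task.
    ansible_remote_tmp: ~/.ansible/tmp

With ansible_pipelining set to true, most modules are instead piped straight to the remote Python interpreter, and the mkdir/sftp/chmod round trips seen in this trace are skipped.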
25675 1727204002.29906: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727204002.29973: variable 'ansible_facts' from source: unknown 25675 1727204002.30294: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/AnsiballZ_setup.py 25675 1727204002.30818: Sending initial data 25675 1727204002.30821: Sent initial data (152 bytes) 25675 1727204002.32564: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204002.32586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204002.32958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204002.33092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204002.33419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204002.35077: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204002.35200: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204002.35288: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpb3y7sspd /root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/AnsiballZ_setup.py <<< 25675 1727204002.35349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/AnsiballZ_setup.py" <<< 25675 1727204002.35408: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpb3y7sspd" to remote "/root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/AnsiballZ_setup.py" <<< 25675 1727204002.38712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204002.38716: stdout chunk (state=3): >>><<< 25675 1727204002.38719: stderr chunk (state=3): >>><<< 25675 1727204002.38721: done transferring module to remote 25675 1727204002.38725: _low_level_execute_command(): starting 25675 1727204002.38728: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/ /root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/AnsiballZ_setup.py && sleep 0' 25675 1727204002.39957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204002.39961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204002.40078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204002.40082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204002.40148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204002.42081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204002.42085: stdout chunk (state=3): >>><<< 25675 1727204002.42087: stderr chunk (state=3): >>><<< 25675 1727204002.42185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204002.42193: _low_level_execute_command(): starting 25675 1727204002.42196: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/AnsiballZ_setup.py && sleep 0' 25675 1727204002.42696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204002.42711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204002.42725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204002.42745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204002.42814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204002.42897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204002.43027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204002.43202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204002.43268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204003.07668: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "22", "epoch": "1727204002", "epoch_int": "1727204002", "date": "2024-09-24", "time": "14:53:22", "iso8601_micro": "2024-09-24T18:53:22.703158Z", "iso8601": "2024-09-24T18:53:22Z", "iso8601_basic": "20240924T145322703158", "iso8601_basic_short": "20240924T145322", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.49755859375, "5m": 0.42919921875, "15m": 0.2275390625}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", <<< 25675 1727204003.07744: stdout chunk (state=3): >>>"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, 
"ansible_interfaces": ["lo", "eth0", "lsr27", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "f2:f7:d0:d6:86:b9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f0f7:d0ff:fed6:86b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "12:fe:e3:2a:a8:0d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::10fe:e3ff:fe2a:a80d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.13.254"], 
"ansible_all_ipv6_addresses": ["fe80::f0f7:d0ff:fed6:86b9", "fe80::10fe:e3ff:fe2a:a80d", "fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d", "fe80::10fe:e3ff:fe2a:a80d", "fe80::f0f7:d0ff:fed6:86b9"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2918, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 613, "free": 2918}, "nocache": {"free": 3275, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 589, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785686016, "block_size": 4096, "block_total": 65519099, "block_available": 63912521, "block_used": 1606578, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": 
"1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727204003.09986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204003.09990: stderr chunk (state=3): >>><<< 25675 1727204003.09992: stdout chunk (state=3): >>><<< 25675 1727204003.09995: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "22", "epoch": "1727204002", "epoch_int": "1727204002", "date": "2024-09-24", "time": "14:53:22", "iso8601_micro": "2024-09-24T18:53:22.703158Z", "iso8601": "2024-09-24T18:53:22Z", "iso8601_basic": "20240924T145322703158", "iso8601_basic_short": "20240924T145322", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.49755859375, "5m": 0.42919921875, "15m": 0.2275390625}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0", "lsr27", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": 
"off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "f2:f7:d0:d6:86:b9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f0f7:d0ff:fed6:86b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "12:fe:e3:2a:a8:0d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::10fe:e3ff:fe2a:a80d", "prefix": "64", 
"scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::f0f7:d0ff:fed6:86b9", "fe80::10fe:e3ff:fe2a:a80d", "fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d", "fe80::10fe:e3ff:fe2a:a80d", "fe80::f0f7:d0ff:fed6:86b9"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2918, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 613, "free": 2918}, "nocache": {"free": 3275, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": 
{"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 589, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785686016, "block_size": 4096, "block_total": 65519099, "block_available": 63912521, "block_used": 1606578, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727204003.10939: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204003.10943: _low_level_execute_command(): starting 25675 1727204003.10946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204002.265536-27418-86772236058491/ > /dev/null 2>&1 && sleep 0' 25675 1727204003.11771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204003.11799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204003.11815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204003.11900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204003.11939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204003.11955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204003.12028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204003.12136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204003.14066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204003.14094: stdout chunk (state=3): >>><<< 25675 1727204003.14097: stderr chunk (state=3): >>><<< 25675 1727204003.14113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204003.14280: handler run complete 25675 1727204003.14283: variable 'ansible_facts' from source: unknown 25675 1727204003.14395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.14761: variable 'ansible_facts' from source: unknown 25675 1727204003.14879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.15036: attempt loop complete, returning result 25675 1727204003.15044: _execute() done 25675 1727204003.15049: dumping result to json 25675 1727204003.15092: done dumping result, returning 25675 1727204003.15103: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-000000000316] 25675 1727204003.15111: sending task result for task 028d2410-947f-41bd-b19d-000000000316 25675 1727204003.15681: done sending task result for task 028d2410-947f-41bd-b19d-000000000316 25675 1727204003.15684: WORKER PROCESS EXITING ok: [managed-node2] 25675 1727204003.16198: no more pending results, returning what we have 25675 1727204003.16202: results queue empty 25675 1727204003.16202: checking for any_errors_fatal 25675 1727204003.16204: done checking for any_errors_fatal 25675 1727204003.16204: checking for max_fail_percentage 25675 1727204003.16206: done checking for max_fail_percentage 25675 1727204003.16207: checking to see if all hosts have failed and the running result is not ok 25675 1727204003.16208: done checking to see if all hosts have failed 25675 1727204003.16209: getting the remaining hosts for this loop 25675 1727204003.16210: done getting the remaining hosts for this loop 25675 1727204003.16221: getting the next task for host managed-node2 25675 1727204003.16226: done getting next task for host managed-node2 25675 1727204003.16228: ^ task is: TASK: meta (flush_handlers) 25675 1727204003.16230: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204003.16234: getting variables 25675 1727204003.16235: in VariableManager get_vars() 25675 1727204003.16256: Calling all_inventory to load vars for managed-node2 25675 1727204003.16259: Calling groups_inventory to load vars for managed-node2 25675 1727204003.16262: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204003.16273: Calling all_plugins_play to load vars for managed-node2 25675 1727204003.16301: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204003.16306: Calling groups_plugins_play to load vars for managed-node2 25675 1727204003.17655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.19182: done with get_vars() 25675 1727204003.19205: done getting variables 25675 1727204003.19264: in VariableManager get_vars() 25675 1727204003.19272: Calling all_inventory to load vars for managed-node2 25675 1727204003.19274: Calling groups_inventory to load vars for managed-node2 25675 1727204003.19278: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204003.19283: Calling all_plugins_play to load vars for managed-node2 25675 1727204003.19285: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204003.19288: Calling groups_plugins_play to load vars for managed-node2 25675 1727204003.20416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.22082: done with get_vars() 25675 1727204003.22109: done queuing things up, now waiting for results queue to drain 25675 1727204003.22112: results queue empty 25675 1727204003.22113: checking for any_errors_fatal 25675 1727204003.22117: done checking for any_errors_fatal 25675 1727204003.22118: checking for max_fail_percentage 25675 1727204003.22124: done checking for max_fail_percentage 25675 1727204003.22125: checking to see if all hosts have failed and the running result is not ok 25675 1727204003.22126: done checking to see if all hosts have failed 25675 1727204003.22126: getting the remaining hosts for this loop 25675 1727204003.22127: done getting the remaining hosts for this loop 25675 1727204003.22130: getting the next task for host managed-node2 25675 1727204003.22134: done getting next task for host managed-node2 25675 1727204003.22136: ^ task is: TASK: Show network_provider 25675 1727204003.22138: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204003.22140: getting variables 25675 1727204003.22141: in VariableManager get_vars() 25675 1727204003.22150: Calling all_inventory to load vars for managed-node2 25675 1727204003.22152: Calling groups_inventory to load vars for managed-node2 25675 1727204003.22155: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204003.22160: Calling all_plugins_play to load vars for managed-node2 25675 1727204003.22163: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204003.22166: Calling groups_plugins_play to load vars for managed-node2 25675 1727204003.23312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.24827: done with get_vars() 25675 1727204003.24849: done getting variables 25675 1727204003.24894: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Tuesday 24 September 2024 14:53:23 -0400 (0:00:01.035) 0:00:22.700 ***** 25675 1727204003.24922: entering _queue_task() for managed-node2/debug 25675 1727204003.25256: worker is 1 (out of 1 available) 25675 1727204003.25267: exiting _queue_task() for managed-node2/debug 25675 1727204003.25481: done queuing things up, now waiting for results queue to drain 25675 1727204003.25483: waiting for pending results... 25675 1727204003.25541: running TaskExecutor() for managed-node2/TASK: Show network_provider 25675 1727204003.25684: in run() - task 028d2410-947f-41bd-b19d-000000000033 25675 1727204003.25688: variable 'ansible_search_path' from source: unknown 25675 1727204003.25714: calling self._execute() 25675 1727204003.25798: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204003.25881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204003.25884: variable 'omit' from source: magic vars 25675 1727204003.26199: variable 'ansible_distribution_major_version' from source: facts 25675 1727204003.26215: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204003.26226: variable 'omit' from source: magic vars 25675 1727204003.26263: variable 'omit' from source: magic vars 25675 1727204003.26299: variable 'omit' from source: magic vars 25675 1727204003.26338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204003.26383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204003.26408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204003.26429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204003.26446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204003.26483: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204003.26490: variable 'ansible_host' from source: host vars for 
'managed-node2' 25675 1727204003.26496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204003.26682: Set connection var ansible_shell_type to sh 25675 1727204003.26686: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204003.26688: Set connection var ansible_timeout to 10 25675 1727204003.26690: Set connection var ansible_pipelining to False 25675 1727204003.26691: Set connection var ansible_shell_executable to /bin/sh 25675 1727204003.26693: Set connection var ansible_connection to ssh 25675 1727204003.26695: variable 'ansible_shell_executable' from source: unknown 25675 1727204003.26696: variable 'ansible_connection' from source: unknown 25675 1727204003.26698: variable 'ansible_module_compression' from source: unknown 25675 1727204003.26700: variable 'ansible_shell_type' from source: unknown 25675 1727204003.26702: variable 'ansible_shell_executable' from source: unknown 25675 1727204003.26703: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204003.26705: variable 'ansible_pipelining' from source: unknown 25675 1727204003.26707: variable 'ansible_timeout' from source: unknown 25675 1727204003.26708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204003.26807: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204003.26823: variable 'omit' from source: magic vars 25675 1727204003.26833: starting attempt loop 25675 1727204003.26839: running the handler 25675 1727204003.26904: variable 'network_provider' from source: set_fact 25675 1727204003.26982: variable 'network_provider' from source: set_fact 25675 1727204003.26997: handler run complete 25675 1727204003.27022: attempt loop complete, returning result 25675 1727204003.27028: _execute() done 25675 1727204003.27034: dumping result to json 25675 1727204003.27041: done dumping result, returning 25675 1727204003.27049: done running TaskExecutor() for managed-node2/TASK: Show network_provider [028d2410-947f-41bd-b19d-000000000033] 25675 1727204003.27056: sending task result for task 028d2410-947f-41bd-b19d-000000000033 ok: [managed-node2] => { "network_provider": "nm" } 25675 1727204003.27258: no more pending results, returning what we have 25675 1727204003.27261: results queue empty 25675 1727204003.27262: checking for any_errors_fatal 25675 1727204003.27265: done checking for any_errors_fatal 25675 1727204003.27265: checking for max_fail_percentage 25675 1727204003.27267: done checking for max_fail_percentage 25675 1727204003.27268: checking to see if all hosts have failed and the running result is not ok 25675 1727204003.27269: done checking to see if all hosts have failed 25675 1727204003.27270: getting the remaining hosts for this loop 25675 1727204003.27272: done getting the remaining hosts for this loop 25675 1727204003.27278: getting the next task for host managed-node2 25675 1727204003.27285: done getting next task for host managed-node2 25675 1727204003.27287: ^ task is: TASK: meta (flush_handlers) 25675 1727204003.27289: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 25675 1727204003.27294: getting variables 25675 1727204003.27296: in VariableManager get_vars() 25675 1727204003.27325: Calling all_inventory to load vars for managed-node2 25675 1727204003.27328: Calling groups_inventory to load vars for managed-node2 25675 1727204003.27332: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204003.27343: Calling all_plugins_play to load vars for managed-node2 25675 1727204003.27345: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204003.27348: Calling groups_plugins_play to load vars for managed-node2 25675 1727204003.27913: done sending task result for task 028d2410-947f-41bd-b19d-000000000033 25675 1727204003.27916: WORKER PROCESS EXITING 25675 1727204003.28983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.30538: done with get_vars() 25675 1727204003.30562: done getting variables 25675 1727204003.30632: in VariableManager get_vars() 25675 1727204003.30642: Calling all_inventory to load vars for managed-node2 25675 1727204003.30644: Calling groups_inventory to load vars for managed-node2 25675 1727204003.30647: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204003.30651: Calling all_plugins_play to load vars for managed-node2 25675 1727204003.30654: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204003.30656: Calling groups_plugins_play to load vars for managed-node2 25675 1727204003.31786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.33343: done with get_vars() 25675 1727204003.33369: done queuing things up, now waiting for results queue to drain 25675 1727204003.33371: results queue empty 25675 1727204003.33372: checking for any_errors_fatal 25675 1727204003.33374: done checking for any_errors_fatal 25675 1727204003.33377: checking for max_fail_percentage 25675 1727204003.33378: done checking for max_fail_percentage 25675 1727204003.33378: checking to see if all hosts have failed and the running result is not ok 25675 1727204003.33379: done checking to see if all hosts have failed 25675 1727204003.33380: getting the remaining hosts for this loop 25675 1727204003.33381: done getting the remaining hosts for this loop 25675 1727204003.33383: getting the next task for host managed-node2 25675 1727204003.33393: done getting next task for host managed-node2 25675 1727204003.33395: ^ task is: TASK: meta (flush_handlers) 25675 1727204003.33396: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204003.33399: getting variables 25675 1727204003.33400: in VariableManager get_vars() 25675 1727204003.33408: Calling all_inventory to load vars for managed-node2 25675 1727204003.33411: Calling groups_inventory to load vars for managed-node2 25675 1727204003.33413: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204003.33418: Calling all_plugins_play to load vars for managed-node2 25675 1727204003.33420: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204003.33423: Calling groups_plugins_play to load vars for managed-node2 25675 1727204003.34714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.36908: done with get_vars() 25675 1727204003.36935: done getting variables 25675 1727204003.37007: in VariableManager get_vars() 25675 1727204003.37017: Calling all_inventory to load vars for managed-node2 25675 1727204003.37020: Calling groups_inventory to load vars for managed-node2 25675 1727204003.37022: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204003.37027: Calling all_plugins_play to load vars for managed-node2 25675 1727204003.37029: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204003.37032: Calling groups_plugins_play to load vars for managed-node2 25675 1727204003.38528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.41342: done with get_vars() 25675 1727204003.41381: done queuing things up, now waiting for results queue to drain 25675 1727204003.41383: results queue empty 25675 1727204003.41384: checking for any_errors_fatal 25675 1727204003.41385: done checking for any_errors_fatal 25675 1727204003.41386: checking for max_fail_percentage 25675 1727204003.41387: done checking for max_fail_percentage 25675 1727204003.41388: checking to see if all hosts have failed and the running result is not ok 25675 1727204003.41389: done checking to see if all hosts have failed 25675 1727204003.41389: getting the remaining hosts for this loop 25675 1727204003.41390: done getting the remaining hosts for this loop 25675 1727204003.41393: getting the next task for host managed-node2 25675 1727204003.41396: done getting next task for host managed-node2 25675 1727204003.41397: ^ task is: None 25675 1727204003.41398: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204003.41399: done queuing things up, now waiting for results queue to drain 25675 1727204003.41400: results queue empty 25675 1727204003.41401: checking for any_errors_fatal 25675 1727204003.41401: done checking for any_errors_fatal 25675 1727204003.41402: checking for max_fail_percentage 25675 1727204003.41403: done checking for max_fail_percentage 25675 1727204003.41404: checking to see if all hosts have failed and the running result is not ok 25675 1727204003.41404: done checking to see if all hosts have failed 25675 1727204003.41406: getting the next task for host managed-node2 25675 1727204003.41407: done getting next task for host managed-node2 25675 1727204003.41408: ^ task is: None 25675 1727204003.41409: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204003.41567: in VariableManager get_vars() 25675 1727204003.41595: done with get_vars() 25675 1727204003.41602: in VariableManager get_vars() 25675 1727204003.41616: done with get_vars() 25675 1727204003.41620: variable 'omit' from source: magic vars 25675 1727204003.41984: variable 'profile' from source: play vars 25675 1727204003.42248: in VariableManager get_vars() 25675 1727204003.42268: done with get_vars() 25675 1727204003.42296: variable 'omit' from source: magic vars 25675 1727204003.42473: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 25675 1727204003.44484: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727204003.44535: getting the remaining hosts for this loop 25675 1727204003.44537: done getting the remaining hosts for this loop 25675 1727204003.44539: getting the next task for host managed-node2 25675 1727204003.44542: done getting next task for host managed-node2 25675 1727204003.44544: ^ task is: TASK: Gathering Facts 25675 1727204003.44546: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204003.44548: getting variables 25675 1727204003.44549: in VariableManager get_vars() 25675 1727204003.44564: Calling all_inventory to load vars for managed-node2 25675 1727204003.44566: Calling groups_inventory to load vars for managed-node2 25675 1727204003.44568: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204003.44691: Calling all_plugins_play to load vars for managed-node2 25675 1727204003.44695: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204003.44699: Calling groups_plugins_play to load vars for managed-node2 25675 1727204003.48219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204003.52549: done with get_vars() 25675 1727204003.52591: done getting variables 25675 1727204003.52961: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:53:23 -0400 (0:00:00.281) 0:00:22.981 ***** 25675 1727204003.53036: entering _queue_task() for managed-node2/gather_facts 25675 1727204003.53945: worker is 1 (out of 1 available) 25675 1727204003.54002: exiting _queue_task() for managed-node2/gather_facts 25675 1727204003.54125: done queuing things up, now waiting for results queue to drain 25675 1727204003.54127: waiting for pending results... 
25675 1727204003.54316: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727204003.54688: in run() - task 028d2410-947f-41bd-b19d-00000000032b 25675 1727204003.54693: variable 'ansible_search_path' from source: unknown 25675 1727204003.54696: calling self._execute() 25675 1727204003.54781: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204003.54805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204003.54848: variable 'omit' from source: magic vars 25675 1727204003.55661: variable 'ansible_distribution_major_version' from source: facts 25675 1727204003.55770: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204003.55778: variable 'omit' from source: magic vars 25675 1727204003.55781: variable 'omit' from source: magic vars 25675 1727204003.55786: variable 'omit' from source: magic vars 25675 1727204003.55824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204003.55880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204003.55988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204003.55992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204003.55995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204003.55997: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204003.56082: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204003.56085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204003.56136: Set connection var ansible_shell_type to sh 25675 1727204003.56232: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204003.56242: Set connection var ansible_timeout to 10 25675 1727204003.56252: Set connection var ansible_pipelining to False 25675 1727204003.56261: Set connection var ansible_shell_executable to /bin/sh 25675 1727204003.56268: Set connection var ansible_connection to ssh 25675 1727204003.56305: variable 'ansible_shell_executable' from source: unknown 25675 1727204003.56313: variable 'ansible_connection' from source: unknown 25675 1727204003.56383: variable 'ansible_module_compression' from source: unknown 25675 1727204003.56387: variable 'ansible_shell_type' from source: unknown 25675 1727204003.56389: variable 'ansible_shell_executable' from source: unknown 25675 1727204003.56392: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204003.56395: variable 'ansible_pipelining' from source: unknown 25675 1727204003.56397: variable 'ansible_timeout' from source: unknown 25675 1727204003.56399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204003.56565: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204003.56627: variable 'omit' from source: magic vars 25675 1727204003.56641: starting attempt loop 25675 1727204003.56657: running the 
handler 25675 1727204003.56708: variable 'ansible_facts' from source: unknown 25675 1727204003.56712: _low_level_execute_command(): starting 25675 1727204003.56720: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204003.57890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204003.57911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204003.57929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204003.57950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204003.57994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204003.58198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204003.58214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204003.58243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204003.58355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204003.60171: stdout chunk (state=3): >>>/root <<< 25675 1727204003.60230: stdout chunk (state=3): >>><<< 25675 1727204003.60506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204003.60510: stderr chunk (state=3): >>><<< 25675 1727204003.60516: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204003.60523: _low_level_execute_command(): starting 25675 1727204003.60527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302 `" && echo ansible-tmp-1727204003.6041377-27685-266383851802302="` echo /root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302 `" ) && sleep 0' 25675 1727204003.61589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204003.61889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204003.62095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204003.62319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204003.64296: stdout chunk (state=3): >>>ansible-tmp-1727204003.6041377-27685-266383851802302=/root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302 <<< 25675 1727204003.64466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204003.64470: stdout chunk (state=3): >>><<< 25675 1727204003.64473: stderr chunk (state=3): >>><<< 25675 1727204003.64499: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204003.6041377-27685-266383851802302=/root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204003.64541: variable 'ansible_module_compression' from source: unknown 25675 1727204003.64701: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727204003.64867: variable 'ansible_facts' from source: unknown 25675 1727204003.65529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/AnsiballZ_setup.py 25675 1727204003.66071: Sending initial data 25675 1727204003.66080: Sent initial data (154 bytes) 25675 1727204003.67349: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204003.67533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204003.67635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204003.67738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204003.69433: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204003.69506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204003.69582: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmprg0pbrfk /root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/AnsiballZ_setup.py <<< 25675 1727204003.69597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/AnsiballZ_setup.py" <<< 25675 1727204003.69648: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmprg0pbrfk" to remote "/root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/AnsiballZ_setup.py" <<< 25675 1727204003.72490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204003.72494: stdout chunk (state=3): >>><<< 25675 1727204003.72496: stderr chunk (state=3): >>><<< 25675 1727204003.72498: done transferring module to remote 25675 1727204003.72512: _low_level_execute_command(): starting 25675 1727204003.72597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/ /root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/AnsiballZ_setup.py && sleep 0' 25675 1727204003.73816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204003.73834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204003.73903: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204003.74006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204003.74052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204003.74081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204003.74204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204003.76203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204003.76207: stdout chunk (state=3): >>><<< 25675 1727204003.76209: stderr chunk (state=3): >>><<< 25675 1727204003.76273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204003.76357: _low_level_execute_command(): starting 25675 1727204003.76361: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/AnsiballZ_setup.py && sleep 0' 25675 1727204003.77547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204003.77551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204003.77554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727204003.77556: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204003.77800: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204003.77907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204003.77910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204003.78580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204003.78661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204004.44701: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "24", "epoch": "1727204004", "epoch_int": "1727204004", "date": "2024-09-24", "time": "14:53:24", "iso8601_micro": "2024-09-24T18:53:24.050538Z", "iso8601": "2024-09-24T18:53:24Z", "iso8601_basic": "20240924T145324050538", "iso8601_basic_short": "20240924T145324", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2924, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 607, "free": 2924}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 590, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785686016, "block_size": 4096, "block_total": 65519099, "block_available": 63912521, "block_used": 1606578, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_loadavg": {"1m": 0.5380859375, "5m": 0.43896484375, "15m": 0.23193359375}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lsr27", "lo", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "f2:f7:d0:d6:86:b9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f0f7:d0ff:fed6:86b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag<<< 25675 1727204004.44901: stdout chunk (state=3): >>>_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "12:fe:e3:2a:a8:0d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::10fe:e3ff:fe2a:a80d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": 
"off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", 
"alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::f0f7:d0ff:fed6:86b9", "fe80::10fe:e3ff:fe2a:a80d", "fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d", "fe80::10fe:e3ff:fe2a:a80d", "fe80::f0f7:d0ff:fed6:86b9"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727204004.46821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204004.46825: stdout chunk (state=3): >>><<< 25675 1727204004.46828: stderr chunk (state=3): >>><<< 25675 1727204004.46907: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "24", "epoch": "1727204004", "epoch_int": "1727204004", "date": "2024-09-24", "time": "14:53:24", "iso8601_micro": "2024-09-24T18:53:24.050538Z", "iso8601": "2024-09-24T18:53:24Z", "iso8601_basic": "20240924T145324050538", "iso8601_basic_short": "20240924T145324", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2924, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 607, "free": 2924}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": 
"ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 590, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785686016, "block_size": 4096, "block_total": 65519099, "block_available": 63912521, "block_used": 1606578, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_loadavg": {"1m": 0.5380859375, "5m": 0.43896484375, "15m": 0.23193359375}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lsr27", "lo", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "f2:f7:d0:d6:86:b9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f0f7:d0ff:fed6:86b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "12:fe:e3:2a:a8:0d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", 
"broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::10fe:e3ff:fe2a:a80d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::f0f7:d0ff:fed6:86b9", "fe80::10fe:e3ff:fe2a:a80d", "fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d", "fe80::10fe:e3ff:fe2a:a80d", "fe80::f0f7:d0ff:fed6:86b9"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204004.47823: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204004.47981: _low_level_execute_command(): starting 25675 1727204004.47984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204003.6041377-27685-266383851802302/ > /dev/null 2>&1 && sleep 0' 25675 1727204004.49111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204004.49171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204004.49191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204004.49208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204004.49370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204004.49496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204004.49682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204004.51615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204004.51624: stdout chunk (state=3): >>><<< 25675 1727204004.51683: stderr chunk (state=3): >>><<< 25675 1727204004.51699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204004.51712: handler run complete 25675 1727204004.52281: variable 'ansible_facts' from source: unknown 25675 1727204004.52284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204004.53018: variable 'ansible_facts' from source: unknown 25675 1727204004.53480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204004.53672: attempt loop complete, returning result 25675 1727204004.53686: _execute() done 25675 1727204004.53694: dumping result to json 25675 1727204004.53742: done dumping result, returning 25675 1727204004.53834: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-00000000032b] 25675 1727204004.53843: sending task result for task 028d2410-947f-41bd-b19d-00000000032b 25675 1727204004.54909: done sending task result for task 028d2410-947f-41bd-b19d-00000000032b 25675 1727204004.54912: WORKER PROCESS EXITING ok: [managed-node2] 25675 1727204004.55603: no more pending results, returning what we have 25675 1727204004.55606: results queue empty 25675 1727204004.55607: checking for any_errors_fatal 25675 1727204004.55608: done checking for any_errors_fatal 25675 1727204004.55609: checking for max_fail_percentage 25675 1727204004.55611: done checking for max_fail_percentage 25675 1727204004.55612: checking to see if all hosts have failed and the running result is not ok 25675 1727204004.55613: done checking to see if all hosts have failed 25675 1727204004.55613: getting the remaining hosts for this loop 25675 1727204004.55615: done getting the remaining hosts for this loop 25675 1727204004.55619: getting the next task for host managed-node2 25675 1727204004.55623: done getting next task for host managed-node2 25675 1727204004.55625: ^ task is: TASK: meta (flush_handlers) 25675 1727204004.55627: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204004.55631: getting variables 25675 1727204004.55632: in VariableManager get_vars() 25675 1727204004.55663: Calling all_inventory to load vars for managed-node2 25675 1727204004.55665: Calling groups_inventory to load vars for managed-node2 25675 1727204004.55667: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204004.55883: Calling all_plugins_play to load vars for managed-node2 25675 1727204004.55887: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204004.55891: Calling groups_plugins_play to load vars for managed-node2 25675 1727204004.58710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204004.61932: done with get_vars() 25675 1727204004.61958: done getting variables 25675 1727204004.62132: in VariableManager get_vars() 25675 1727204004.62145: Calling all_inventory to load vars for managed-node2 25675 1727204004.62147: Calling groups_inventory to load vars for managed-node2 25675 1727204004.62149: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204004.62154: Calling all_plugins_play to load vars for managed-node2 25675 1727204004.62156: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204004.62159: Calling groups_plugins_play to load vars for managed-node2 25675 1727204004.64422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204004.67651: done with get_vars() 25675 1727204004.67893: done queuing things up, now waiting for results queue to drain 25675 1727204004.67896: results queue empty 25675 1727204004.67897: checking for any_errors_fatal 25675 1727204004.67901: done checking for any_errors_fatal 25675 1727204004.67902: checking for max_fail_percentage 25675 1727204004.67903: done checking for max_fail_percentage 25675 1727204004.67909: checking to see if all hosts have failed and the running result is not ok 25675 1727204004.67910: done checking to see if all hosts have failed 25675 1727204004.67910: getting the remaining hosts for this loop 25675 1727204004.67911: done getting the remaining hosts for this loop 25675 1727204004.67914: getting the next task for host managed-node2 25675 1727204004.67918: done getting next task for host managed-node2 25675 1727204004.67921: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25675 1727204004.67922: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204004.67932: getting variables 25675 1727204004.67933: in VariableManager get_vars() 25675 1727204004.67948: Calling all_inventory to load vars for managed-node2 25675 1727204004.67950: Calling groups_inventory to load vars for managed-node2 25675 1727204004.67952: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204004.67958: Calling all_plugins_play to load vars for managed-node2 25675 1727204004.67960: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204004.67962: Calling groups_plugins_play to load vars for managed-node2 25675 1727204004.70745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204004.74337: done with get_vars() 25675 1727204004.74380: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:53:24 -0400 (0:00:01.214) 0:00:24.196 ***** 25675 1727204004.74511: entering _queue_task() for managed-node2/include_tasks 25675 1727204004.74922: worker is 1 (out of 1 available) 25675 1727204004.74935: exiting _queue_task() for managed-node2/include_tasks 25675 1727204004.74946: done queuing things up, now waiting for results queue to drain 25675 1727204004.74948: waiting for pending results... 25675 1727204004.75194: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25675 1727204004.75322: in run() - task 028d2410-947f-41bd-b19d-00000000003c 25675 1727204004.75350: variable 'ansible_search_path' from source: unknown 25675 1727204004.75358: variable 'ansible_search_path' from source: unknown 25675 1727204004.75403: calling self._execute() 25675 1727204004.75549: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204004.75568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204004.75590: variable 'omit' from source: magic vars 25675 1727204004.76058: variable 'ansible_distribution_major_version' from source: facts 25675 1727204004.76083: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204004.76094: _execute() done 25675 1727204004.76101: dumping result to json 25675 1727204004.76115: done dumping result, returning 25675 1727204004.76181: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-41bd-b19d-00000000003c] 25675 1727204004.76184: sending task result for task 028d2410-947f-41bd-b19d-00000000003c 25675 1727204004.76481: done sending task result for task 028d2410-947f-41bd-b19d-00000000003c 25675 1727204004.76484: WORKER PROCESS EXITING 25675 1727204004.76521: no more pending results, returning what we have 25675 1727204004.76526: in VariableManager get_vars() 25675 1727204004.76565: Calling all_inventory to load vars for managed-node2 25675 1727204004.76568: Calling groups_inventory to load vars for managed-node2 25675 1727204004.76571: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204004.76585: Calling all_plugins_play to load vars for managed-node2 25675 1727204004.76588: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204004.76592: Calling groups_plugins_play to load vars for managed-node2 25675 1727204004.78721: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204004.87630: done with get_vars() 25675 1727204004.87655: variable 'ansible_search_path' from source: unknown 25675 1727204004.87657: variable 'ansible_search_path' from source: unknown 25675 1727204004.87690: we have included files to process 25675 1727204004.87692: generating all_blocks data 25675 1727204004.87693: done generating all_blocks data 25675 1727204004.87694: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727204004.87695: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727204004.87697: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727204004.88803: done processing included file 25675 1727204004.88805: iterating over new_blocks loaded from include file 25675 1727204004.88807: in VariableManager get_vars() 25675 1727204004.88832: done with get_vars() 25675 1727204004.88834: filtering new block on tags 25675 1727204004.88850: done filtering new block on tags 25675 1727204004.88852: in VariableManager get_vars() 25675 1727204004.88870: done with get_vars() 25675 1727204004.88871: filtering new block on tags 25675 1727204004.88964: done filtering new block on tags 25675 1727204004.88968: in VariableManager get_vars() 25675 1727204004.88990: done with get_vars() 25675 1727204004.88992: filtering new block on tags 25675 1727204004.89007: done filtering new block on tags 25675 1727204004.89009: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 25675 1727204004.89014: extending task lists for all hosts with included blocks 25675 1727204004.89906: done extending task lists 25675 1727204004.89908: done processing included files 25675 1727204004.89909: results queue empty 25675 1727204004.89909: checking for any_errors_fatal 25675 1727204004.89911: done checking for any_errors_fatal 25675 1727204004.89912: checking for max_fail_percentage 25675 1727204004.89913: done checking for max_fail_percentage 25675 1727204004.89913: checking to see if all hosts have failed and the running result is not ok 25675 1727204004.89914: done checking to see if all hosts have failed 25675 1727204004.89915: getting the remaining hosts for this loop 25675 1727204004.89916: done getting the remaining hosts for this loop 25675 1727204004.89918: getting the next task for host managed-node2 25675 1727204004.89922: done getting next task for host managed-node2 25675 1727204004.89924: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25675 1727204004.89927: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204004.89936: getting variables 25675 1727204004.89937: in VariableManager get_vars() 25675 1727204004.89951: Calling all_inventory to load vars for managed-node2 25675 1727204004.89953: Calling groups_inventory to load vars for managed-node2 25675 1727204004.89955: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204004.89960: Calling all_plugins_play to load vars for managed-node2 25675 1727204004.89962: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204004.89965: Calling groups_plugins_play to load vars for managed-node2 25675 1727204004.91851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204004.94664: done with get_vars() 25675 1727204004.94695: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:53:24 -0400 (0:00:00.203) 0:00:24.400 ***** 25675 1727204004.94888: entering _queue_task() for managed-node2/setup 25675 1727204004.95569: worker is 1 (out of 1 available) 25675 1727204004.95886: exiting _queue_task() for managed-node2/setup 25675 1727204004.95898: done queuing things up, now waiting for results queue to drain 25675 1727204004.95899: waiting for pending results... 25675 1727204004.96054: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25675 1727204004.96413: in run() - task 028d2410-947f-41bd-b19d-00000000036c 25675 1727204004.96441: variable 'ansible_search_path' from source: unknown 25675 1727204004.96535: variable 'ansible_search_path' from source: unknown 25675 1727204004.96552: calling self._execute() 25675 1727204004.96864: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204004.96867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204004.96870: variable 'omit' from source: magic vars 25675 1727204004.97631: variable 'ansible_distribution_major_version' from source: facts 25675 1727204004.97650: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204004.98114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204005.02857: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204005.03341: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204005.03431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204005.03536: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204005.03568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204005.03783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204005.03833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25675 1727204005.03910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204005.04154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204005.04157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204005.04160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204005.04371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204005.04378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204005.04382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204005.04385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204005.04743: variable '__network_required_facts' from source: role '' defaults 25675 1727204005.04760: variable 'ansible_facts' from source: unknown 25675 1727204005.06020: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25675 1727204005.06029: when evaluation is False, skipping this task 25675 1727204005.06037: _execute() done 25675 1727204005.06045: dumping result to json 25675 1727204005.06053: done dumping result, returning 25675 1727204005.06066: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-41bd-b19d-00000000036c] 25675 1727204005.06080: sending task result for task 028d2410-947f-41bd-b19d-00000000036c 25675 1727204005.06205: done sending task result for task 028d2410-947f-41bd-b19d-00000000036c skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204005.06253: no more pending results, returning what we have 25675 1727204005.06257: results queue empty 25675 1727204005.06258: checking for any_errors_fatal 25675 1727204005.06260: done checking for any_errors_fatal 25675 1727204005.06261: checking for max_fail_percentage 25675 1727204005.06263: done checking for max_fail_percentage 25675 1727204005.06263: checking to see if all hosts have failed and the running result is not ok 25675 1727204005.06264: done checking to see if all hosts have failed 25675 1727204005.06265: getting the remaining hosts for this loop 25675 1727204005.06268: done getting the remaining hosts for this loop 25675 1727204005.06272: getting the 
next task for host managed-node2 25675 1727204005.06282: done getting next task for host managed-node2 25675 1727204005.06286: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25675 1727204005.06289: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204005.06304: getting variables 25675 1727204005.06306: in VariableManager get_vars() 25675 1727204005.06349: Calling all_inventory to load vars for managed-node2 25675 1727204005.06352: Calling groups_inventory to load vars for managed-node2 25675 1727204005.06355: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204005.06366: Calling all_plugins_play to load vars for managed-node2 25675 1727204005.06370: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204005.06373: Calling groups_plugins_play to load vars for managed-node2 25675 1727204005.07127: WORKER PROCESS EXITING 25675 1727204005.08247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204005.09888: done with get_vars() 25675 1727204005.09913: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.151) 0:00:24.551 ***** 25675 1727204005.10018: entering _queue_task() for managed-node2/stat 25675 1727204005.10384: worker is 1 (out of 1 available) 25675 1727204005.10508: exiting _queue_task() for managed-node2/stat 25675 1727204005.10519: done queuing things up, now waiting for results queue to drain 25675 1727204005.10520: waiting for pending results... 
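The skip recorded above comes from the role's fact-gathering guard: the setup task at set_facts.yml:3 only runs when __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, that is, when at least one fact the role needs is missing from the cached ansible_facts, and its result is censored because no_log: true is in effect. A minimal sketch of a task with that shape, assuming a gather_subset value for illustration (the task name, the setup action, the conditional expression and the no_log behaviour are taken from the log above; the subset shown is not):

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # assumed subset for illustration; the role may request a different one
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true

Because every fact listed in __network_required_facts was already present for managed-node2, the difference is empty, the when clause evaluates to False, and the task is skipped without contacting the host.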
25675 1727204005.10705: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 25675 1727204005.10841: in run() - task 028d2410-947f-41bd-b19d-00000000036e 25675 1727204005.10867: variable 'ansible_search_path' from source: unknown 25675 1727204005.10874: variable 'ansible_search_path' from source: unknown 25675 1727204005.10921: calling self._execute() 25675 1727204005.11024: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204005.11037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204005.11050: variable 'omit' from source: magic vars 25675 1727204005.11447: variable 'ansible_distribution_major_version' from source: facts 25675 1727204005.11464: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204005.11645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204005.11933: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204005.11989: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204005.12034: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204005.12078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204005.12220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204005.12271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204005.12296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204005.12343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204005.12429: variable '__network_is_ostree' from source: set_fact 25675 1727204005.12450: Evaluated conditional (not __network_is_ostree is defined): False 25675 1727204005.12453: when evaluation is False, skipping this task 25675 1727204005.12484: _execute() done 25675 1727204005.12487: dumping result to json 25675 1727204005.12490: done dumping result, returning 25675 1727204005.12492: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-41bd-b19d-00000000036e] 25675 1727204005.12494: sending task result for task 028d2410-947f-41bd-b19d-00000000036e skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25675 1727204005.12729: no more pending results, returning what we have 25675 1727204005.12733: results queue empty 25675 1727204005.12734: checking for any_errors_fatal 25675 1727204005.12739: done checking for any_errors_fatal 25675 1727204005.12740: checking for max_fail_percentage 25675 1727204005.12741: done checking for max_fail_percentage 25675 1727204005.12742: checking to see if all hosts have 
failed and the running result is not ok 25675 1727204005.12743: done checking to see if all hosts have failed 25675 1727204005.12744: getting the remaining hosts for this loop 25675 1727204005.12746: done getting the remaining hosts for this loop 25675 1727204005.12751: getting the next task for host managed-node2 25675 1727204005.12757: done getting next task for host managed-node2 25675 1727204005.12761: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25675 1727204005.12764: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204005.12781: getting variables 25675 1727204005.12783: in VariableManager get_vars() 25675 1727204005.12824: Calling all_inventory to load vars for managed-node2 25675 1727204005.12826: Calling groups_inventory to load vars for managed-node2 25675 1727204005.12829: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204005.12840: Calling all_plugins_play to load vars for managed-node2 25675 1727204005.12843: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204005.12846: Calling groups_plugins_play to load vars for managed-node2 25675 1727204005.13589: done sending task result for task 028d2410-947f-41bd-b19d-00000000036e 25675 1727204005.13592: WORKER PROCESS EXITING 25675 1727204005.14461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204005.16120: done with get_vars() 25675 1727204005.16147: done getting variables 25675 1727204005.16209: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.062) 0:00:24.613 ***** 25675 1727204005.16249: entering _queue_task() for managed-node2/set_fact 25675 1727204005.16608: worker is 1 (out of 1 available) 25675 1727204005.16618: exiting _queue_task() for managed-node2/set_fact 25675 1727204005.16630: done queuing things up, now waiting for results queue to drain 25675 1727204005.16631: waiting for pending results... 
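The two ostree tasks above follow a common probe-then-flag pattern: a stat probe and a follow-up set_fact are both guarded by not __network_is_ostree is defined, so once the flag exists (here the log shows __network_is_ostree already coming from an earlier set_fact) both tasks are skipped. A sketch of that pair, assuming the probed path and the register name purely for illustration (the task names, the stat and set_fact actions, and the shared conditional are confirmed by the log; the path and register are not):

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed probe path, for illustration only
      register: __ostree_booted_stat    # assumed register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

With the flag already set, neither task touches managed-node2, which is why both report skip_reason "Conditional result was False".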
25675 1727204005.16917: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25675 1727204005.17055: in run() - task 028d2410-947f-41bd-b19d-00000000036f 25675 1727204005.17079: variable 'ansible_search_path' from source: unknown 25675 1727204005.17088: variable 'ansible_search_path' from source: unknown 25675 1727204005.17132: calling self._execute() 25675 1727204005.17230: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204005.17242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204005.17255: variable 'omit' from source: magic vars 25675 1727204005.17625: variable 'ansible_distribution_major_version' from source: facts 25675 1727204005.17647: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204005.17819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204005.18097: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204005.18147: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204005.18190: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204005.18226: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204005.18363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204005.18400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204005.18431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204005.18462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204005.18561: variable '__network_is_ostree' from source: set_fact 25675 1727204005.18573: Evaluated conditional (not __network_is_ostree is defined): False 25675 1727204005.18583: when evaluation is False, skipping this task 25675 1727204005.18592: _execute() done 25675 1727204005.18599: dumping result to json 25675 1727204005.18608: done dumping result, returning 25675 1727204005.18681: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-41bd-b19d-00000000036f] 25675 1727204005.18685: sending task result for task 028d2410-947f-41bd-b19d-00000000036f skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25675 1727204005.18882: no more pending results, returning what we have 25675 1727204005.18887: results queue empty 25675 1727204005.18888: checking for any_errors_fatal 25675 1727204005.18896: done checking for any_errors_fatal 25675 1727204005.18897: checking for max_fail_percentage 25675 1727204005.18898: done checking for max_fail_percentage 25675 1727204005.18900: checking to see 
if all hosts have failed and the running result is not ok 25675 1727204005.18900: done checking to see if all hosts have failed 25675 1727204005.18901: getting the remaining hosts for this loop 25675 1727204005.18903: done getting the remaining hosts for this loop 25675 1727204005.18907: getting the next task for host managed-node2 25675 1727204005.18916: done getting next task for host managed-node2 25675 1727204005.18920: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25675 1727204005.18923: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204005.18939: getting variables 25675 1727204005.18942: in VariableManager get_vars() 25675 1727204005.18985: Calling all_inventory to load vars for managed-node2 25675 1727204005.18988: Calling groups_inventory to load vars for managed-node2 25675 1727204005.18991: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204005.19002: Calling all_plugins_play to load vars for managed-node2 25675 1727204005.19005: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204005.19009: Calling groups_plugins_play to load vars for managed-node2 25675 1727204005.19589: done sending task result for task 028d2410-947f-41bd-b19d-00000000036f 25675 1727204005.19594: WORKER PROCESS EXITING 25675 1727204005.20785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204005.22428: done with get_vars() 25675 1727204005.22462: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.063) 0:00:24.676 ***** 25675 1727204005.22564: entering _queue_task() for managed-node2/service_facts 25675 1727204005.22932: worker is 1 (out of 1 available) 25675 1727204005.22945: exiting _queue_task() for managed-node2/service_facts 25675 1727204005.22958: done queuing things up, now waiting for results queue to drain 25675 1727204005.22959: waiting for pending results... 
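Unlike the skipped tasks before it, the "Check which services are running" task at set_facts.yml:21 does execute: the log that follows shows the SSH connection being set up, the AnsiballZ_service_facts.py payload being transferred and run, and the module returning a large ansible_facts.services map keyed by unit name with state, status and source fields. A minimal sketch of the gathering task plus an illustrative consumer (the service_facts task name and action come from the log; the debug task is an assumption added only to show how the returned structure can be read):

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: "Illustrative only - inspect one entry of the gathered services map"
      ansible.builtin.debug:
        msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"

The role later uses this map to decide which network service backend (for example NetworkManager) is present and running on the managed node, as the JSON payload below makes visible.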
25675 1727204005.23247: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 25675 1727204005.23391: in run() - task 028d2410-947f-41bd-b19d-000000000371 25675 1727204005.23422: variable 'ansible_search_path' from source: unknown 25675 1727204005.23430: variable 'ansible_search_path' from source: unknown 25675 1727204005.23470: calling self._execute() 25675 1727204005.23571: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204005.23587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204005.23603: variable 'omit' from source: magic vars 25675 1727204005.23997: variable 'ansible_distribution_major_version' from source: facts 25675 1727204005.24053: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204005.24057: variable 'omit' from source: magic vars 25675 1727204005.24099: variable 'omit' from source: magic vars 25675 1727204005.24137: variable 'omit' from source: magic vars 25675 1727204005.24190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204005.24229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204005.24253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204005.24380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204005.24383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204005.24386: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204005.24388: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204005.24390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204005.24451: Set connection var ansible_shell_type to sh 25675 1727204005.24461: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204005.24471: Set connection var ansible_timeout to 10 25675 1727204005.24489: Set connection var ansible_pipelining to False 25675 1727204005.24499: Set connection var ansible_shell_executable to /bin/sh 25675 1727204005.24506: Set connection var ansible_connection to ssh 25675 1727204005.24535: variable 'ansible_shell_executable' from source: unknown 25675 1727204005.24542: variable 'ansible_connection' from source: unknown 25675 1727204005.24548: variable 'ansible_module_compression' from source: unknown 25675 1727204005.24554: variable 'ansible_shell_type' from source: unknown 25675 1727204005.24559: variable 'ansible_shell_executable' from source: unknown 25675 1727204005.24565: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204005.24572: variable 'ansible_pipelining' from source: unknown 25675 1727204005.24581: variable 'ansible_timeout' from source: unknown 25675 1727204005.24595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204005.24798: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204005.24822: variable 'omit' from source: magic vars 25675 
1727204005.24831: starting attempt loop 25675 1727204005.24881: running the handler 25675 1727204005.24885: _low_level_execute_command(): starting 25675 1727204005.24887: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204005.25705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204005.25723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204005.25742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204005.25919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204005.27544: stdout chunk (state=3): >>>/root <<< 25675 1727204005.27682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204005.27686: stdout chunk (state=3): >>><<< 25675 1727204005.27692: stderr chunk (state=3): >>><<< 25675 1727204005.27713: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204005.27729: _low_level_execute_command(): starting 25675 1727204005.27786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380 `" && echo ansible-tmp-1727204005.2771451-27880-224465174816380="` echo /root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380 `" ) && 
sleep 0' 25675 1727204005.28972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204005.29140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204005.29190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204005.29241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204005.31244: stdout chunk (state=3): >>>ansible-tmp-1727204005.2771451-27880-224465174816380=/root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380 <<< 25675 1727204005.31381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204005.31385: stderr chunk (state=3): >>><<< 25675 1727204005.31387: stdout chunk (state=3): >>><<< 25675 1727204005.31465: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204005.2771451-27880-224465174816380=/root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204005.31482: variable 'ansible_module_compression' from source: unknown 25675 1727204005.31533: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 25675 1727204005.31595: variable 'ansible_facts' from source: unknown 25675 1727204005.31695: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/AnsiballZ_service_facts.py 25675 1727204005.31928: Sending initial data 25675 1727204005.31931: Sent initial data (162 bytes) 25675 1727204005.32561: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204005.32630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204005.32649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204005.32679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204005.32803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204005.34425: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204005.34507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204005.34592: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpsi9m_suw /root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/AnsiballZ_service_facts.py <<< 25675 1727204005.34616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/AnsiballZ_service_facts.py" <<< 25675 1727204005.34699: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpsi9m_suw" to remote "/root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/AnsiballZ_service_facts.py" <<< 25675 1727204005.35890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204005.35894: stdout chunk (state=3): >>><<< 25675 1727204005.35897: stderr chunk (state=3): >>><<< 25675 1727204005.35899: done transferring module to remote 25675 1727204005.35901: _low_level_execute_command(): starting 25675 1727204005.35904: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/ /root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/AnsiballZ_service_facts.py && sleep 0' 25675 1727204005.36498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204005.36511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204005.36529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204005.36556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204005.36571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204005.36646: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204005.36689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204005.36719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204005.36813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204005.38832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204005.38836: stdout chunk (state=3): >>><<< 25675 1727204005.38839: stderr chunk (state=3): >>><<< 25675 1727204005.38983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204005.38987: _low_level_execute_command(): starting 25675 1727204005.38991: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/AnsiballZ_service_facts.py && sleep 0' 25675 1727204005.39956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204005.39960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204005.39963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204005.39965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204005.39967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204005.40022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204005.40130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204006.94121: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {<<< 25675 1727204006.94179: stdout chunk (state=3): >>>"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25675 1727204006.96182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204006.96186: stdout chunk (state=3): >>><<< 25675 1727204006.96188: stderr chunk (state=3): >>><<< 25675 1727204006.96193: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204006.97001: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204006.97017: _low_level_execute_command(): starting 25675 1727204006.97028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204005.2771451-27880-224465174816380/ > /dev/null 2>&1 && sleep 0' 25675 1727204006.97691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204006.97719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204006.97798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204006.99738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204006.99781: stderr chunk (state=3): >>><<< 25675 1727204006.99793: stdout chunk (state=3): >>><<< 25675 1727204006.99811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204006.99823: handler run complete 25675 1727204007.00003: variable 'ansible_facts' from source: unknown 25675 1727204007.00133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.00560: variable 'ansible_facts' from source: unknown 25675 1727204007.00670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.00807: attempt loop complete, returning result 25675 1727204007.00811: _execute() done 25675 1727204007.00814: dumping result to json 25675 1727204007.00880: done dumping result, returning 25675 1727204007.00884: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-41bd-b19d-000000000371] 25675 1727204007.00919: sending task result for task 028d2410-947f-41bd-b19d-000000000371 25675 1727204007.02887: done sending task result for task 028d2410-947f-41bd-b19d-000000000371 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204007.02982: no more pending results, returning what we have 25675 1727204007.02985: results queue empty 25675 1727204007.02986: checking for any_errors_fatal 25675 1727204007.02990: done checking for any_errors_fatal 25675 1727204007.02991: checking for max_fail_percentage 25675 1727204007.02992: done checking for max_fail_percentage 25675 1727204007.02993: checking to see if all hosts have failed and the running result is not ok 25675 1727204007.02994: done checking to see if all hosts have failed 25675 1727204007.02994: getting the remaining hosts for this loop 25675 1727204007.02996: done getting the remaining hosts for this loop 25675 1727204007.02999: getting the next task for host managed-node2 25675 1727204007.03004: done getting next task for host managed-node2 25675 1727204007.03007: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25675 1727204007.03009: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204007.03019: getting variables 25675 1727204007.03021: in VariableManager get_vars() 25675 1727204007.03049: Calling all_inventory to load vars for managed-node2 25675 1727204007.03052: Calling groups_inventory to load vars for managed-node2 25675 1727204007.03053: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204007.03061: Calling all_plugins_play to load vars for managed-node2 25675 1727204007.03063: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204007.03065: Calling groups_plugins_play to load vars for managed-node2 25675 1727204007.03593: WORKER PROCESS EXITING 25675 1727204007.03935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.04832: done with get_vars() 25675 1727204007.04859: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:53:27 -0400 (0:00:01.823) 0:00:26.500 ***** 25675 1727204007.04947: entering _queue_task() for managed-node2/package_facts 25675 1727204007.05232: worker is 1 (out of 1 available) 25675 1727204007.05245: exiting _queue_task() for managed-node2/package_facts 25675 1727204007.05257: done queuing things up, now waiting for results queue to drain 25675 1727204007.05258: waiting for pending results... 25675 1727204007.05460: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 25675 1727204007.05561: in run() - task 028d2410-947f-41bd-b19d-000000000372 25675 1727204007.05574: variable 'ansible_search_path' from source: unknown 25675 1727204007.05579: variable 'ansible_search_path' from source: unknown 25675 1727204007.05608: calling self._execute() 25675 1727204007.05679: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.05687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.05697: variable 'omit' from source: magic vars 25675 1727204007.06005: variable 'ansible_distribution_major_version' from source: facts 25675 1727204007.06015: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204007.06021: variable 'omit' from source: magic vars 25675 1727204007.06062: variable 'omit' from source: magic vars 25675 1727204007.06093: variable 'omit' from source: magic vars 25675 1727204007.06123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204007.06151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204007.06166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204007.06186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204007.06195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204007.06218: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204007.06221: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.06223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.06355: Set connection 
var ansible_shell_type to sh 25675 1727204007.06359: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204007.06362: Set connection var ansible_timeout to 10 25675 1727204007.06364: Set connection var ansible_pipelining to False 25675 1727204007.06367: Set connection var ansible_shell_executable to /bin/sh 25675 1727204007.06369: Set connection var ansible_connection to ssh 25675 1727204007.06371: variable 'ansible_shell_executable' from source: unknown 25675 1727204007.06373: variable 'ansible_connection' from source: unknown 25675 1727204007.06377: variable 'ansible_module_compression' from source: unknown 25675 1727204007.06380: variable 'ansible_shell_type' from source: unknown 25675 1727204007.06381: variable 'ansible_shell_executable' from source: unknown 25675 1727204007.06384: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.06386: variable 'ansible_pipelining' from source: unknown 25675 1727204007.06388: variable 'ansible_timeout' from source: unknown 25675 1727204007.06390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.06488: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204007.06497: variable 'omit' from source: magic vars 25675 1727204007.06504: starting attempt loop 25675 1727204007.06507: running the handler 25675 1727204007.06520: _low_level_execute_command(): starting 25675 1727204007.06527: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204007.07051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204007.07055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.07057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 25675 1727204007.07064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.07122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204007.07129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204007.07132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204007.07209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204007.08922: stdout chunk (state=3): >>>/root <<< 25675 1727204007.09026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204007.09073: stderr chunk (state=3): >>><<< 25675 1727204007.09091: stdout chunk 
(state=3): >>><<< 25675 1727204007.09095: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204007.09115: _low_level_execute_command(): starting 25675 1727204007.09118: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971 `" && echo ansible-tmp-1727204007.0909557-27968-77494219347971="` echo /root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971 `" ) && sleep 0' 25675 1727204007.09658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204007.09661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204007.09664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.09677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204007.09680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.09738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204007.09742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204007.09832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204007.11802: stdout chunk (state=3): >>>ansible-tmp-1727204007.0909557-27968-77494219347971=/root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971 <<< 25675 1727204007.11946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 25675 1727204007.11971: stderr chunk (state=3): >>><<< 25675 1727204007.11987: stdout chunk (state=3): >>><<< 25675 1727204007.12004: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204007.0909557-27968-77494219347971=/root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204007.12052: variable 'ansible_module_compression' from source: unknown 25675 1727204007.12118: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 25675 1727204007.12161: variable 'ansible_facts' from source: unknown 25675 1727204007.12289: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/AnsiballZ_package_facts.py 25675 1727204007.12401: Sending initial data 25675 1727204007.12404: Sent initial data (161 bytes) 25675 1727204007.12894: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204007.12898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204007.12900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.12903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204007.12905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.12967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204007.12971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204007.13020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 25675 1727204007.13110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204007.14739: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204007.14816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204007.14901: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp55v9ue6f /root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/AnsiballZ_package_facts.py <<< 25675 1727204007.14904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/AnsiballZ_package_facts.py" <<< 25675 1727204007.14978: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp55v9ue6f" to remote "/root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/AnsiballZ_package_facts.py" <<< 25675 1727204007.16244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204007.16289: stderr chunk (state=3): >>><<< 25675 1727204007.16292: stdout chunk (state=3): >>><<< 25675 1727204007.16314: done transferring module to remote 25675 1727204007.16324: _low_level_execute_command(): starting 25675 1727204007.16328: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/ /root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/AnsiballZ_package_facts.py && sleep 0' 25675 1727204007.16780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204007.16784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204007.16786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.16789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204007.16791: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204007.16797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.16849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204007.16855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204007.16857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204007.16924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204007.18806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204007.18827: stderr chunk (state=3): >>><<< 25675 1727204007.18830: stdout chunk (state=3): >>><<< 25675 1727204007.18842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204007.18845: _low_level_execute_command(): starting 25675 1727204007.18851: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/AnsiballZ_package_facts.py && sleep 0' 25675 1727204007.19365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204007.19368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.19371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204007.19373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727204007.19377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204007.19433: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204007.19437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204007.19512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204007.64299: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 25675 1727204007.64384: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": 
[{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source":<<< 25675 1727204007.64464: stdout chunk (state=3): >>> "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], 
"qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": 
[{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25675 1727204007.66207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204007.66238: stderr chunk (state=3): >>><<< 25675 1727204007.66241: stdout chunk (state=3): >>><<< 25675 1727204007.66278: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204007.67883: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204007.67887: _low_level_execute_command(): starting 25675 1727204007.67891: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204007.0909557-27968-77494219347971/ > /dev/null 2>&1 && sleep 0' 25675 1727204007.68395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204007.68410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204007.68427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204007.68446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204007.68463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204007.68472: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204007.68559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204007.68584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204007.68600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204007.68707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204007.70572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204007.70606: stderr chunk (state=3): >>><<< 25675 1727204007.70609: stdout chunk (state=3): >>><<< 25675 1727204007.70626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204007.70632: handler run complete 25675 1727204007.71108: variable 'ansible_facts' from source: unknown 25675 1727204007.71380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.72982: variable 'ansible_facts' from source: unknown 25675 1727204007.73201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.73587: attempt loop complete, returning result 25675 1727204007.73602: _execute() done 25675 1727204007.73605: dumping result to json 25675 1727204007.73725: done dumping result, returning 25675 1727204007.73733: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-41bd-b19d-000000000372] 25675 1727204007.73738: sending task result for task 028d2410-947f-41bd-b19d-000000000372 25675 1727204007.75004: done sending task result for task 028d2410-947f-41bd-b19d-000000000372 25675 1727204007.75008: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204007.75095: no more pending results, returning what we have 25675 1727204007.75097: results queue empty 25675 1727204007.75097: checking for any_errors_fatal 25675 1727204007.75102: done checking for any_errors_fatal 25675 1727204007.75102: checking for max_fail_percentage 25675 1727204007.75103: done checking for max_fail_percentage 25675 1727204007.75104: checking to see if all hosts have failed and the running result is not ok 25675 1727204007.75105: done checking to see if all hosts have failed 25675 1727204007.75105: getting the remaining hosts for this loop 25675 1727204007.75106: done getting the remaining hosts for this loop 25675 1727204007.75108: getting the next task for host managed-node2 25675 1727204007.75113: done getting next task for host managed-node2 25675 1727204007.75115: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25675 1727204007.75117: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204007.75123: getting variables 25675 1727204007.75124: in VariableManager get_vars() 25675 1727204007.75148: Calling all_inventory to load vars for managed-node2 25675 1727204007.75150: Calling groups_inventory to load vars for managed-node2 25675 1727204007.75151: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204007.75158: Calling all_plugins_play to load vars for managed-node2 25675 1727204007.75159: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204007.75161: Calling groups_plugins_play to load vars for managed-node2 25675 1727204007.75882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.76897: done with get_vars() 25675 1727204007.76928: done getting variables 25675 1727204007.77001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:53:27 -0400 (0:00:00.720) 0:00:27.221 ***** 25675 1727204007.77028: entering _queue_task() for managed-node2/debug 25675 1727204007.77394: worker is 1 (out of 1 available) 25675 1727204007.77407: exiting _queue_task() for managed-node2/debug 25675 1727204007.77420: done queuing things up, now waiting for results queue to drain 25675 1727204007.77422: waiting for pending results... 25675 1727204007.77622: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 25675 1727204007.77781: in run() - task 028d2410-947f-41bd-b19d-00000000003d 25675 1727204007.77790: variable 'ansible_search_path' from source: unknown 25675 1727204007.77793: variable 'ansible_search_path' from source: unknown 25675 1727204007.77823: calling self._execute() 25675 1727204007.77925: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.77937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.77950: variable 'omit' from source: magic vars 25675 1727204007.78333: variable 'ansible_distribution_major_version' from source: facts 25675 1727204007.78343: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204007.78349: variable 'omit' from source: magic vars 25675 1727204007.78386: variable 'omit' from source: magic vars 25675 1727204007.78453: variable 'network_provider' from source: set_fact 25675 1727204007.78467: variable 'omit' from source: magic vars 25675 1727204007.78507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204007.78535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204007.78552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204007.78567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204007.78613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 
1727204007.78632: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204007.78636: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.78639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.78717: Set connection var ansible_shell_type to sh 25675 1727204007.78720: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204007.78723: Set connection var ansible_timeout to 10 25675 1727204007.78780: Set connection var ansible_pipelining to False 25675 1727204007.78783: Set connection var ansible_shell_executable to /bin/sh 25675 1727204007.78800: Set connection var ansible_connection to ssh 25675 1727204007.78804: variable 'ansible_shell_executable' from source: unknown 25675 1727204007.78807: variable 'ansible_connection' from source: unknown 25675 1727204007.78810: variable 'ansible_module_compression' from source: unknown 25675 1727204007.78812: variable 'ansible_shell_type' from source: unknown 25675 1727204007.78818: variable 'ansible_shell_executable' from source: unknown 25675 1727204007.78822: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.78824: variable 'ansible_pipelining' from source: unknown 25675 1727204007.78827: variable 'ansible_timeout' from source: unknown 25675 1727204007.78829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.78998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204007.79038: variable 'omit' from source: magic vars 25675 1727204007.79041: starting attempt loop 25675 1727204007.79044: running the handler 25675 1727204007.79064: handler run complete 25675 1727204007.79085: attempt loop complete, returning result 25675 1727204007.79088: _execute() done 25675 1727204007.79091: dumping result to json 25675 1727204007.79094: done dumping result, returning 25675 1727204007.79100: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-41bd-b19d-00000000003d] 25675 1727204007.79105: sending task result for task 028d2410-947f-41bd-b19d-00000000003d 25675 1727204007.79221: done sending task result for task 028d2410-947f-41bd-b19d-00000000003d 25675 1727204007.79224: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 25675 1727204007.79358: no more pending results, returning what we have 25675 1727204007.79361: results queue empty 25675 1727204007.79362: checking for any_errors_fatal 25675 1727204007.79371: done checking for any_errors_fatal 25675 1727204007.79371: checking for max_fail_percentage 25675 1727204007.79373: done checking for max_fail_percentage 25675 1727204007.79377: checking to see if all hosts have failed and the running result is not ok 25675 1727204007.79378: done checking to see if all hosts have failed 25675 1727204007.79379: getting the remaining hosts for this loop 25675 1727204007.79380: done getting the remaining hosts for this loop 25675 1727204007.79384: getting the next task for host managed-node2 25675 1727204007.79390: done getting next task for host managed-node2 25675 1727204007.79427: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 25675 1727204007.79429: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204007.79440: getting variables 25675 1727204007.79442: in VariableManager get_vars() 25675 1727204007.79478: Calling all_inventory to load vars for managed-node2 25675 1727204007.79480: Calling groups_inventory to load vars for managed-node2 25675 1727204007.79483: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204007.79491: Calling all_plugins_play to load vars for managed-node2 25675 1727204007.79494: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204007.79496: Calling groups_plugins_play to load vars for managed-node2 25675 1727204007.80813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.82165: done with get_vars() 25675 1727204007.82203: done getting variables 25675 1727204007.82248: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:53:27 -0400 (0:00:00.052) 0:00:27.273 ***** 25675 1727204007.82272: entering _queue_task() for managed-node2/fail 25675 1727204007.82613: worker is 1 (out of 1 available) 25675 1727204007.82630: exiting _queue_task() for managed-node2/fail 25675 1727204007.82645: done queuing things up, now waiting for results queue to drain 25675 1727204007.82646: waiting for pending results... 
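
The two role tasks traced above ("Check which packages are installed" and "Print network provider") are not reproduced in this log; only their module arguments, conditionals, and results are. A minimal sketch of what they likely look like in roles/network/tasks/main.yml follows. The task layout and the debug message template are assumptions; the package_facts arguments (manager ["auto"], strategy "first"), the no_log censoring, and the "Using network provider: nm" output come directly from the trace above.

- name: Check which packages are installed         # result above is censored because no_log is set
  ansible.builtin.package_facts:
    manager: auto                                   # module_args in the trace: manager ["auto"], strategy "first"
    strategy: first
  no_log: true

- name: Print network provider                      # task path main.yml:7 in the trace
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # network_provider comes from set_fact; prints "nm" above
  when: ansible_distribution_major_version != '6'   # evaluated True in the trace
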
25675 1727204007.82865: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25675 1727204007.82970: in run() - task 028d2410-947f-41bd-b19d-00000000003e 25675 1727204007.83030: variable 'ansible_search_path' from source: unknown 25675 1727204007.83037: variable 'ansible_search_path' from source: unknown 25675 1727204007.83043: calling self._execute() 25675 1727204007.83133: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.83137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.83141: variable 'omit' from source: magic vars 25675 1727204007.83543: variable 'ansible_distribution_major_version' from source: facts 25675 1727204007.83549: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204007.83708: variable 'network_state' from source: role '' defaults 25675 1727204007.83713: Evaluated conditional (network_state != {}): False 25675 1727204007.83716: when evaluation is False, skipping this task 25675 1727204007.83718: _execute() done 25675 1727204007.83721: dumping result to json 25675 1727204007.83724: done dumping result, returning 25675 1727204007.83727: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-41bd-b19d-00000000003e] 25675 1727204007.83729: sending task result for task 028d2410-947f-41bd-b19d-00000000003e 25675 1727204007.83822: done sending task result for task 028d2410-947f-41bd-b19d-00000000003e 25675 1727204007.83825: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204007.83886: no more pending results, returning what we have 25675 1727204007.83889: results queue empty 25675 1727204007.83890: checking for any_errors_fatal 25675 1727204007.83901: done checking for any_errors_fatal 25675 1727204007.83902: checking for max_fail_percentage 25675 1727204007.83904: done checking for max_fail_percentage 25675 1727204007.83906: checking to see if all hosts have failed and the running result is not ok 25675 1727204007.83907: done checking to see if all hosts have failed 25675 1727204007.83908: getting the remaining hosts for this loop 25675 1727204007.83909: done getting the remaining hosts for this loop 25675 1727204007.83913: getting the next task for host managed-node2 25675 1727204007.83920: done getting next task for host managed-node2 25675 1727204007.83923: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25675 1727204007.83925: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204007.83943: getting variables 25675 1727204007.83944: in VariableManager get_vars() 25675 1727204007.83980: Calling all_inventory to load vars for managed-node2 25675 1727204007.83983: Calling groups_inventory to load vars for managed-node2 25675 1727204007.83985: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204007.83994: Calling all_plugins_play to load vars for managed-node2 25675 1727204007.83996: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204007.83999: Calling groups_plugins_play to load vars for managed-node2 25675 1727204007.85037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.86152: done with get_vars() 25675 1727204007.86173: done getting variables 25675 1727204007.86220: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:53:27 -0400 (0:00:00.039) 0:00:27.313 ***** 25675 1727204007.86250: entering _queue_task() for managed-node2/fail 25675 1727204007.86523: worker is 1 (out of 1 available) 25675 1727204007.86537: exiting _queue_task() for managed-node2/fail 25675 1727204007.86548: done queuing things up, now waiting for results queue to drain 25675 1727204007.86550: waiting for pending results... 
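
The skip result above records the action plugin (fail) and the exact false_condition for this abort task; everything else in the following sketch, including the failure message, is an assumption, since the role source itself is not part of this log.

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network_state variable cannot be used with the initscripts provider   # assumed wording, not shown in the log
  when: network_state != {}    # recorded above as false_condition, so the task is skipped
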
25675 1727204007.86723: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25675 1727204007.86805: in run() - task 028d2410-947f-41bd-b19d-00000000003f 25675 1727204007.86815: variable 'ansible_search_path' from source: unknown 25675 1727204007.86819: variable 'ansible_search_path' from source: unknown 25675 1727204007.86846: calling self._execute() 25675 1727204007.86924: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.86927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.86937: variable 'omit' from source: magic vars 25675 1727204007.87221: variable 'ansible_distribution_major_version' from source: facts 25675 1727204007.87230: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204007.87315: variable 'network_state' from source: role '' defaults 25675 1727204007.87331: Evaluated conditional (network_state != {}): False 25675 1727204007.87334: when evaluation is False, skipping this task 25675 1727204007.87337: _execute() done 25675 1727204007.87340: dumping result to json 25675 1727204007.87343: done dumping result, returning 25675 1727204007.87346: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-41bd-b19d-00000000003f] 25675 1727204007.87349: sending task result for task 028d2410-947f-41bd-b19d-00000000003f 25675 1727204007.87440: done sending task result for task 028d2410-947f-41bd-b19d-00000000003f 25675 1727204007.87443: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204007.87490: no more pending results, returning what we have 25675 1727204007.87494: results queue empty 25675 1727204007.87495: checking for any_errors_fatal 25675 1727204007.87501: done checking for any_errors_fatal 25675 1727204007.87501: checking for max_fail_percentage 25675 1727204007.87503: done checking for max_fail_percentage 25675 1727204007.87504: checking to see if all hosts have failed and the running result is not ok 25675 1727204007.87505: done checking to see if all hosts have failed 25675 1727204007.87506: getting the remaining hosts for this loop 25675 1727204007.87507: done getting the remaining hosts for this loop 25675 1727204007.87511: getting the next task for host managed-node2 25675 1727204007.87517: done getting next task for host managed-node2 25675 1727204007.87520: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25675 1727204007.87522: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204007.87539: getting variables 25675 1727204007.87541: in VariableManager get_vars() 25675 1727204007.87579: Calling all_inventory to load vars for managed-node2 25675 1727204007.87583: Calling groups_inventory to load vars for managed-node2 25675 1727204007.87585: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204007.87594: Calling all_plugins_play to load vars for managed-node2 25675 1727204007.87596: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204007.87599: Calling groups_plugins_play to load vars for managed-node2 25675 1727204007.88450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.89456: done with get_vars() 25675 1727204007.89477: done getting variables 25675 1727204007.89523: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:53:27 -0400 (0:00:00.032) 0:00:27.346 ***** 25675 1727204007.89547: entering _queue_task() for managed-node2/fail 25675 1727204007.89815: worker is 1 (out of 1 available) 25675 1727204007.89829: exiting _queue_task() for managed-node2/fail 25675 1727204007.89840: done queuing things up, now waiting for results queue to drain 25675 1727204007.89841: waiting for pending results... 
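
Both abort tasks skip for the same reason: network_state is resolved "from source: role '' defaults" and is empty, so `network_state != {}` is False. A minimal sketch of the corresponding defaults entry, assuming the conventional defaults file location (the actual defaults file is not shown in this log):

# roles/network/defaults/main.yml (assumed path)
network_state: {}    # empty by default, so `network_state != {}` evaluates to False and both fail tasks skip
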
25675 1727204007.90133: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25675 1727204007.90255: in run() - task 028d2410-947f-41bd-b19d-000000000040 25675 1727204007.90259: variable 'ansible_search_path' from source: unknown 25675 1727204007.90262: variable 'ansible_search_path' from source: unknown 25675 1727204007.90266: calling self._execute() 25675 1727204007.90345: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.90362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.90379: variable 'omit' from source: magic vars 25675 1727204007.90806: variable 'ansible_distribution_major_version' from source: facts 25675 1727204007.90811: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204007.90984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204007.93073: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204007.93155: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204007.93197: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204007.93225: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204007.93257: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204007.93340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204007.93361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204007.93384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204007.93427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204007.93442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204007.93515: variable 'ansible_distribution_major_version' from source: facts 25675 1727204007.93527: Evaluated conditional (ansible_distribution_major_version | int > 9): True 25675 1727204007.93636: variable 'ansible_distribution' from source: facts 25675 1727204007.93639: variable '__network_rh_distros' from source: role '' defaults 25675 1727204007.93641: Evaluated conditional (ansible_distribution in __network_rh_distros): True 25675 1727204007.93920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204007.93927: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204007.93931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204007.93970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204007.94007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204007.94040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204007.94074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204007.94092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204007.94134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204007.94157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204007.94242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204007.94246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204007.94380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204007.94383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204007.94385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204007.94685: variable 'network_connections' from source: play vars 25675 1727204007.94701: variable 'profile' from source: play vars 25675 1727204007.94772: variable 'profile' from source: play vars 25675 1727204007.94786: variable 'interface' from source: set_fact 25675 1727204007.94849: variable 'interface' from source: set_fact 25675 1727204007.94870: variable 'network_state' from source: role '' defaults 25675 
1727204007.94961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204007.95149: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204007.95198: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204007.95233: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204007.95267: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204007.95316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204007.95350: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204007.95381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204007.95413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204007.95452: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 25675 1727204007.95461: when evaluation is False, skipping this task 25675 1727204007.95471: _execute() done 25675 1727204007.95481: dumping result to json 25675 1727204007.95580: done dumping result, returning 25675 1727204007.95584: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-41bd-b19d-000000000040] 25675 1727204007.95587: sending task result for task 028d2410-947f-41bd-b19d-000000000040 skipping: [managed-node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 25675 1727204007.95716: no more pending results, returning what we have 25675 1727204007.95719: results queue empty 25675 1727204007.95720: checking for any_errors_fatal 25675 1727204007.95728: done checking for any_errors_fatal 25675 1727204007.95729: checking for max_fail_percentage 25675 1727204007.95731: done checking for max_fail_percentage 25675 1727204007.95731: checking to see if all hosts have failed and the running result is not ok 25675 1727204007.95732: done checking to see if all hosts have failed 25675 1727204007.95733: getting the remaining hosts for this loop 25675 1727204007.95734: done getting the remaining hosts for this loop 25675 1727204007.95738: getting the next task for host managed-node2 25675 1727204007.95744: done getting next task for host managed-node2 25675 1727204007.95748: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25675 1727204007.95749: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204007.95762: getting variables 25675 1727204007.95764: in VariableManager get_vars() 25675 1727204007.95801: Calling all_inventory to load vars for managed-node2 25675 1727204007.95803: Calling groups_inventory to load vars for managed-node2 25675 1727204007.95805: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204007.95815: Calling all_plugins_play to load vars for managed-node2 25675 1727204007.95818: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204007.95820: Calling groups_plugins_play to load vars for managed-node2 25675 1727204007.96389: done sending task result for task 028d2410-947f-41bd-b19d-000000000040 25675 1727204007.96392: WORKER PROCESS EXITING 25675 1727204007.96730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204007.97898: done with get_vars() 25675 1727204007.97924: done getting variables 25675 1727204007.98000: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:53:27 -0400 (0:00:00.084) 0:00:27.431 ***** 25675 1727204007.98041: entering _queue_task() for managed-node2/dnf 25675 1727204007.98515: worker is 1 (out of 1 available) 25675 1727204007.98530: exiting _queue_task() for managed-node2/dnf 25675 1727204007.98542: done queuing things up, now waiting for results queue to drain 25675 1727204007.98543: waiting for pending results... 
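
For the teaming abort task skipped above, the trace preserves the full false_condition verbatim, plus the two distribution checks that evaluated to True, so this sketch is closer to the real task than the earlier ones; only the keyword layout and the failure message are assumed.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is no longer supported on this distribution version   # assumed wording, not shown in the log
  when:
    - ansible_distribution_major_version | int > 9     # evaluated True in the trace
    - ansible_distribution in __network_rh_distros     # evaluated True in the trace
    - >-
      network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0

The folded expression in the last item is the false_condition recorded in the skip result, so the task is skipped even though the distribution checks pass.
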
25675 1727204007.98905: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25675 1727204007.98970: in run() - task 028d2410-947f-41bd-b19d-000000000041 25675 1727204007.98994: variable 'ansible_search_path' from source: unknown 25675 1727204007.99181: variable 'ansible_search_path' from source: unknown 25675 1727204007.99185: calling self._execute() 25675 1727204007.99187: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204007.99190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204007.99192: variable 'omit' from source: magic vars 25675 1727204007.99561: variable 'ansible_distribution_major_version' from source: facts 25675 1727204007.99581: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204007.99790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204008.02187: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204008.02282: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204008.02327: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204008.02380: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204008.02420: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204008.02510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.02557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.02593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.02643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.02680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.02831: variable 'ansible_distribution' from source: facts 25675 1727204008.02841: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.02862: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25675 1727204008.02999: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204008.03141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.03207: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.03209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.03239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.03266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.03323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.03358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.03424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.03445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.03468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.03519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.03559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.03642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.03660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.03685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.03898: variable 'network_connections' from source: play vars 25675 1727204008.03918: variable 'profile' from source: play vars 25675 1727204008.04012: variable 'profile' from source: play vars 25675 1727204008.04028: variable 'interface' from source: set_fact 25675 1727204008.04197: variable 'interface' from source: set_fact 25675 1727204008.04219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 25675 1727204008.04866: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204008.04921: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204008.04973: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204008.05019: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204008.05069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204008.05104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204008.05143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.05380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204008.05384: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727204008.05558: variable 'network_connections' from source: play vars 25675 1727204008.05569: variable 'profile' from source: play vars 25675 1727204008.05640: variable 'profile' from source: play vars 25675 1727204008.05649: variable 'interface' from source: set_fact 25675 1727204008.05716: variable 'interface' from source: set_fact 25675 1727204008.05764: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727204008.05798: when evaluation is False, skipping this task 25675 1727204008.05801: _execute() done 25675 1727204008.05803: dumping result to json 25675 1727204008.05806: done dumping result, returning 25675 1727204008.05817: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000041] 25675 1727204008.05981: sending task result for task 028d2410-947f-41bd-b19d-000000000041 25675 1727204008.06052: done sending task result for task 028d2410-947f-41bd-b19d-000000000041 25675 1727204008.06055: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727204008.06118: no more pending results, returning what we have 25675 1727204008.06123: results queue empty 25675 1727204008.06124: checking for any_errors_fatal 25675 1727204008.06129: done checking for any_errors_fatal 25675 1727204008.06130: checking for max_fail_percentage 25675 1727204008.06131: done checking for max_fail_percentage 25675 1727204008.06132: checking to see if all hosts have failed and the running result is not ok 25675 1727204008.06133: done checking to see if all hosts have failed 25675 1727204008.06134: getting the remaining hosts for this loop 25675 1727204008.06136: done getting the remaining hosts for this loop 25675 
1727204008.06140: getting the next task for host managed-node2 25675 1727204008.06146: done getting next task for host managed-node2 25675 1727204008.06150: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25675 1727204008.06152: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204008.06166: getting variables 25675 1727204008.06168: in VariableManager get_vars() 25675 1727204008.06211: Calling all_inventory to load vars for managed-node2 25675 1727204008.06214: Calling groups_inventory to load vars for managed-node2 25675 1727204008.06217: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204008.06228: Calling all_plugins_play to load vars for managed-node2 25675 1727204008.06232: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204008.06235: Calling groups_plugins_play to load vars for managed-node2 25675 1727204008.08517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204008.10300: done with get_vars() 25675 1727204008.10330: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25675 1727204008.10419: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.124) 0:00:27.555 ***** 25675 1727204008.10450: entering _queue_task() for managed-node2/yum 25675 1727204008.11096: worker is 1 (out of 1 available) 25675 1727204008.11106: exiting _queue_task() for managed-node2/yum 25675 1727204008.11115: done queuing things up, now waiting for results queue to drain 25675 1727204008.11116: waiting for pending results... 
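The DNF availability check at tasks/main.yml:36 was skipped above because neither wireless nor team connections are defined for this host. The log confirms the task runs through the dnf action plugin and is gated by the two role-default booleans; the package list and check-only behaviour below are assumptions, not taken from the log:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # assumed argument; the real task body is not shown here
    state: latest
  check_mode: true                   # assumed: query only, no changes
  when: __network_wireless_connections_defined or __network_team_connections_defined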
25675 1727204008.11583: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25675 1727204008.11591: in run() - task 028d2410-947f-41bd-b19d-000000000042 25675 1727204008.11595: variable 'ansible_search_path' from source: unknown 25675 1727204008.11598: variable 'ansible_search_path' from source: unknown 25675 1727204008.11601: calling self._execute() 25675 1727204008.11666: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204008.11720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204008.11724: variable 'omit' from source: magic vars 25675 1727204008.12126: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.12135: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204008.12264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204008.14147: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204008.14195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204008.14222: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204008.14250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204008.14270: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204008.14332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.14356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.14374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.14403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.14415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.14491: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.14503: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25675 1727204008.14506: when evaluation is False, skipping this task 25675 1727204008.14509: _execute() done 25675 1727204008.14511: dumping result to json 25675 1727204008.14516: done dumping result, returning 25675 1727204008.14524: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000042] 25675 
1727204008.14528: sending task result for task 028d2410-947f-41bd-b19d-000000000042 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25675 1727204008.14668: no more pending results, returning what we have 25675 1727204008.14672: results queue empty 25675 1727204008.14672: checking for any_errors_fatal 25675 1727204008.14681: done checking for any_errors_fatal 25675 1727204008.14682: checking for max_fail_percentage 25675 1727204008.14684: done checking for max_fail_percentage 25675 1727204008.14686: checking to see if all hosts have failed and the running result is not ok 25675 1727204008.14687: done checking to see if all hosts have failed 25675 1727204008.14687: getting the remaining hosts for this loop 25675 1727204008.14688: done getting the remaining hosts for this loop 25675 1727204008.14692: getting the next task for host managed-node2 25675 1727204008.14699: done getting next task for host managed-node2 25675 1727204008.14702: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25675 1727204008.14704: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204008.14718: getting variables 25675 1727204008.14719: in VariableManager get_vars() 25675 1727204008.14755: Calling all_inventory to load vars for managed-node2 25675 1727204008.14757: Calling groups_inventory to load vars for managed-node2 25675 1727204008.14759: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204008.14769: Calling all_plugins_play to load vars for managed-node2 25675 1727204008.14772: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204008.14774: Calling groups_plugins_play to load vars for managed-node2 25675 1727204008.14794: done sending task result for task 028d2410-947f-41bd-b19d-000000000042 25675 1727204008.14799: WORKER PROCESS EXITING 25675 1727204008.17639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204008.19677: done with get_vars() 25675 1727204008.19707: done getting variables 25675 1727204008.19927: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.097) 0:00:27.653 ***** 25675 1727204008.20212: entering _queue_task() for managed-node2/fail 25675 1727204008.20854: worker is 1 (out of 1 available) 25675 1727204008.20866: exiting _queue_task() for managed-node2/fail 25675 1727204008.20879: done queuing things up, now waiting for results queue to drain 25675 1727204008.20881: waiting for pending results... 
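The YUM variant at tasks/main.yml:48 is gated on the distribution major version and never reaches the wireless/team check here; on this host the version is 8 or newer, and the log also notes that ansible.builtin.yum is redirected to ansible.builtin.dnf. A quick, hypothetical ad-hoc task that reproduces just that gate:

- name: Show whether the YUM code path would apply on this host
  ansible.builtin.debug:
    msg: "{{ ansible_distribution_major_version | int < 8 }}"

For managed-node2 this renders False, matching the skip reported above.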
25675 1727204008.21296: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25675 1727204008.21436: in run() - task 028d2410-947f-41bd-b19d-000000000043 25675 1727204008.21459: variable 'ansible_search_path' from source: unknown 25675 1727204008.21470: variable 'ansible_search_path' from source: unknown 25675 1727204008.21534: calling self._execute() 25675 1727204008.21710: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204008.21715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204008.21718: variable 'omit' from source: magic vars 25675 1727204008.22101: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.22118: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204008.22264: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204008.22452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204008.25242: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204008.25371: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204008.25510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204008.25513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204008.25516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204008.25603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.25819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.25927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.25931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.25933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.25974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.26062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.26096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.26263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.26288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.26329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.26387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.26497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.26535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.26596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.26880: variable 'network_connections' from source: play vars 25675 1727204008.27016: variable 'profile' from source: play vars 25675 1727204008.27174: variable 'profile' from source: play vars 25675 1727204008.27188: variable 'interface' from source: set_fact 25675 1727204008.27267: variable 'interface' from source: set_fact 25675 1727204008.27374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204008.27592: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204008.27634: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204008.27680: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204008.27718: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204008.27772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204008.27805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204008.27869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.27872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204008.27926: 
variable '__network_team_connections_defined' from source: role '' defaults 25675 1727204008.28193: variable 'network_connections' from source: play vars 25675 1727204008.28209: variable 'profile' from source: play vars 25675 1727204008.28281: variable 'profile' from source: play vars 25675 1727204008.28298: variable 'interface' from source: set_fact 25675 1727204008.28380: variable 'interface' from source: set_fact 25675 1727204008.28396: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727204008.28406: when evaluation is False, skipping this task 25675 1727204008.28421: _execute() done 25675 1727204008.28424: dumping result to json 25675 1727204008.28426: done dumping result, returning 25675 1727204008.28450: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000043] 25675 1727204008.28460: sending task result for task 028d2410-947f-41bd-b19d-000000000043 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727204008.28707: no more pending results, returning what we have 25675 1727204008.28711: results queue empty 25675 1727204008.28712: checking for any_errors_fatal 25675 1727204008.28719: done checking for any_errors_fatal 25675 1727204008.28719: checking for max_fail_percentage 25675 1727204008.28721: done checking for max_fail_percentage 25675 1727204008.28722: checking to see if all hosts have failed and the running result is not ok 25675 1727204008.28723: done checking to see if all hosts have failed 25675 1727204008.28724: getting the remaining hosts for this loop 25675 1727204008.28725: done getting the remaining hosts for this loop 25675 1727204008.28729: getting the next task for host managed-node2 25675 1727204008.28735: done getting next task for host managed-node2 25675 1727204008.28739: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 25675 1727204008.28741: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204008.28756: getting variables 25675 1727204008.28759: in VariableManager get_vars() 25675 1727204008.28800: Calling all_inventory to load vars for managed-node2 25675 1727204008.28803: Calling groups_inventory to load vars for managed-node2 25675 1727204008.28806: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204008.28816: Calling all_plugins_play to load vars for managed-node2 25675 1727204008.28820: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204008.28822: Calling groups_plugins_play to load vars for managed-node2 25675 1727204008.29389: done sending task result for task 028d2410-947f-41bd-b19d-000000000043 25675 1727204008.29392: WORKER PROCESS EXITING 25675 1727204008.30670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204008.32309: done with get_vars() 25675 1727204008.32340: done getting variables 25675 1727204008.32415: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.122) 0:00:27.775 ***** 25675 1727204008.32449: entering _queue_task() for managed-node2/package 25675 1727204008.32998: worker is 1 (out of 1 available) 25675 1727204008.33009: exiting _queue_task() for managed-node2/package 25675 1727204008.33020: done queuing things up, now waiting for results queue to drain 25675 1727204008.33021: waiting for pending results... 
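The consent prompt at tasks/main.yml:60 uses the fail action (loaded just before the task banner) and the same wireless-or-team gate as the DNF check. The two booleans come from the role's defaults; a plausible shape for them, assuming they mirror the selectattr expression printed for the earlier teaming check (the wireless pattern in particular is an assumption):

__network_team_connections_defined: >-
  {{ network_connections | selectattr('type', 'defined')
  | selectattr('type', 'match', '^team$') | list | length > 0 }}
__network_wireless_connections_defined: >-
  {{ network_connections | selectattr('type', 'defined')
  | selectattr('type', 'match', '^wireless$') | list | length > 0 }}

Because the connection profile under test for managed-node2 is neither wireless nor team, both booleans evaluate to False and the fail task is skipped.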
25675 1727204008.33146: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 25675 1727204008.33362: in run() - task 028d2410-947f-41bd-b19d-000000000044 25675 1727204008.33366: variable 'ansible_search_path' from source: unknown 25675 1727204008.33368: variable 'ansible_search_path' from source: unknown 25675 1727204008.33371: calling self._execute() 25675 1727204008.33436: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204008.33449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204008.33478: variable 'omit' from source: magic vars 25675 1727204008.33913: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.33930: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204008.34152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204008.34455: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204008.34507: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204008.34544: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204008.34593: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204008.34715: variable 'network_packages' from source: role '' defaults 25675 1727204008.34826: variable '__network_provider_setup' from source: role '' defaults 25675 1727204008.34886: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727204008.34919: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727204008.34932: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727204008.35055: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727204008.35256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204008.37272: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204008.37362: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204008.37480: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204008.37483: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204008.37485: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204008.37570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.37616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.37648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.37695: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.37722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.37771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.37805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.37843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.37890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.37910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.38262: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25675 1727204008.38293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.38323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.38353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.38412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.38432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.38536: variable 'ansible_python' from source: facts 25675 1727204008.38567: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25675 1727204008.38670: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727204008.38765: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727204008.38907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.38948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25675 1727204008.38981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.39036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.39057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.39142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.39153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.39187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.39252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.39255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.39416: variable 'network_connections' from source: play vars 25675 1727204008.39426: variable 'profile' from source: play vars 25675 1727204008.39681: variable 'profile' from source: play vars 25675 1727204008.39684: variable 'interface' from source: set_fact 25675 1727204008.39686: variable 'interface' from source: set_fact 25675 1727204008.39697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204008.39727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204008.39757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.39793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204008.39852: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204008.40157: variable 'network_connections' from source: play vars 25675 1727204008.40167: variable 'profile' from source: play vars 25675 1727204008.40280: variable 'profile' from source: play vars 25675 1727204008.40294: variable 'interface' from source: set_fact 25675 1727204008.40370: variable 'interface' from source: set_fact 25675 1727204008.40406: variable 
'__network_packages_default_wireless' from source: role '' defaults 25675 1727204008.40570: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204008.40832: variable 'network_connections' from source: play vars 25675 1727204008.40842: variable 'profile' from source: play vars 25675 1727204008.40919: variable 'profile' from source: play vars 25675 1727204008.40929: variable 'interface' from source: set_fact 25675 1727204008.41040: variable 'interface' from source: set_fact 25675 1727204008.41071: variable '__network_packages_default_team' from source: role '' defaults 25675 1727204008.41173: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727204008.41536: variable 'network_connections' from source: play vars 25675 1727204008.41558: variable 'profile' from source: play vars 25675 1727204008.41625: variable 'profile' from source: play vars 25675 1727204008.41635: variable 'interface' from source: set_fact 25675 1727204008.41767: variable 'interface' from source: set_fact 25675 1727204008.41815: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727204008.41895: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727204008.41907: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727204008.41987: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727204008.42210: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25675 1727204008.42749: variable 'network_connections' from source: play vars 25675 1727204008.42752: variable 'profile' from source: play vars 25675 1727204008.42771: variable 'profile' from source: play vars 25675 1727204008.42782: variable 'interface' from source: set_fact 25675 1727204008.42856: variable 'interface' from source: set_fact 25675 1727204008.42874: variable 'ansible_distribution' from source: facts 25675 1727204008.43080: variable '__network_rh_distros' from source: role '' defaults 25675 1727204008.43083: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.43086: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25675 1727204008.43088: variable 'ansible_distribution' from source: facts 25675 1727204008.43090: variable '__network_rh_distros' from source: role '' defaults 25675 1727204008.43121: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.43151: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25675 1727204008.43335: variable 'ansible_distribution' from source: facts 25675 1727204008.43344: variable '__network_rh_distros' from source: role '' defaults 25675 1727204008.43354: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.43397: variable 'network_provider' from source: set_fact 25675 1727204008.43426: variable 'ansible_facts' from source: unknown 25675 1727204008.44214: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 25675 1727204008.44222: when evaluation is False, skipping this task 25675 1727204008.44230: _execute() done 25675 1727204008.44237: dumping result to json 25675 1727204008.44295: done dumping result, returning 25675 1727204008.44298: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 
[028d2410-947f-41bd-b19d-000000000044] 25675 1727204008.44301: sending task result for task 028d2410-947f-41bd-b19d-000000000044 25675 1727204008.44453: done sending task result for task 028d2410-947f-41bd-b19d-000000000044 25675 1727204008.44455: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 25675 1727204008.44519: no more pending results, returning what we have 25675 1727204008.44524: results queue empty 25675 1727204008.44525: checking for any_errors_fatal 25675 1727204008.44531: done checking for any_errors_fatal 25675 1727204008.44532: checking for max_fail_percentage 25675 1727204008.44534: done checking for max_fail_percentage 25675 1727204008.44535: checking to see if all hosts have failed and the running result is not ok 25675 1727204008.44536: done checking to see if all hosts have failed 25675 1727204008.44537: getting the remaining hosts for this loop 25675 1727204008.44539: done getting the remaining hosts for this loop 25675 1727204008.44543: getting the next task for host managed-node2 25675 1727204008.44551: done getting next task for host managed-node2 25675 1727204008.44554: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25675 1727204008.44557: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204008.44572: getting variables 25675 1727204008.44575: in VariableManager get_vars() 25675 1727204008.44736: Calling all_inventory to load vars for managed-node2 25675 1727204008.44739: Calling groups_inventory to load vars for managed-node2 25675 1727204008.44742: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204008.44757: Calling all_plugins_play to load vars for managed-node2 25675 1727204008.44760: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204008.44762: Calling groups_plugins_play to load vars for managed-node2 25675 1727204008.46674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204008.48694: done with get_vars() 25675 1727204008.48717: done getting variables 25675 1727204008.48790: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.163) 0:00:27.939 ***** 25675 1727204008.48820: entering _queue_task() for managed-node2/package 25675 1727204008.49234: worker is 1 (out of 1 available) 25675 1727204008.49247: exiting _queue_task() for managed-node2/package 25675 1727204008.49261: done queuing things up, now waiting for results queue to drain 25675 1727204008.49262: waiting for pending results... 
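The Install packages task at tasks/main.yml:73 goes through the package action plugin and is skipped because every entry in network_packages is already present in the package facts gathered earlier (ansible_facts.packages is typically populated by a package_facts run). A minimal sketch consistent with the printed conditional; the name and state arguments are assumptions:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # assumed; the real arguments are not shown in this log
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

Because the subset test is True here, its negation is False and no package transaction is attempted.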
25675 1727204008.49594: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25675 1727204008.49605: in run() - task 028d2410-947f-41bd-b19d-000000000045 25675 1727204008.49637: variable 'ansible_search_path' from source: unknown 25675 1727204008.49645: variable 'ansible_search_path' from source: unknown 25675 1727204008.49691: calling self._execute() 25675 1727204008.49792: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204008.49804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204008.49838: variable 'omit' from source: magic vars 25675 1727204008.50233: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.50272: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204008.50388: variable 'network_state' from source: role '' defaults 25675 1727204008.50405: Evaluated conditional (network_state != {}): False 25675 1727204008.50484: when evaluation is False, skipping this task 25675 1727204008.50487: _execute() done 25675 1727204008.50490: dumping result to json 25675 1727204008.50492: done dumping result, returning 25675 1727204008.50495: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-41bd-b19d-000000000045] 25675 1727204008.50498: sending task result for task 028d2410-947f-41bd-b19d-000000000045 25675 1727204008.50569: done sending task result for task 028d2410-947f-41bd-b19d-000000000045 25675 1727204008.50573: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204008.50628: no more pending results, returning what we have 25675 1727204008.50632: results queue empty 25675 1727204008.50632: checking for any_errors_fatal 25675 1727204008.50639: done checking for any_errors_fatal 25675 1727204008.50640: checking for max_fail_percentage 25675 1727204008.50642: done checking for max_fail_percentage 25675 1727204008.50643: checking to see if all hosts have failed and the running result is not ok 25675 1727204008.50644: done checking to see if all hosts have failed 25675 1727204008.50644: getting the remaining hosts for this loop 25675 1727204008.50646: done getting the remaining hosts for this loop 25675 1727204008.50649: getting the next task for host managed-node2 25675 1727204008.50655: done getting next task for host managed-node2 25675 1727204008.50659: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25675 1727204008.50661: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204008.50680: getting variables 25675 1727204008.50682: in VariableManager get_vars() 25675 1727204008.50719: Calling all_inventory to load vars for managed-node2 25675 1727204008.50722: Calling groups_inventory to load vars for managed-node2 25675 1727204008.50724: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204008.50735: Calling all_plugins_play to load vars for managed-node2 25675 1727204008.50737: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204008.50740: Calling groups_plugins_play to load vars for managed-node2 25675 1727204008.53693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204008.56942: done with get_vars() 25675 1727204008.56980: done getting variables 25675 1727204008.57057: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.082) 0:00:28.022 ***** 25675 1727204008.57101: entering _queue_task() for managed-node2/package 25675 1727204008.57555: worker is 1 (out of 1 available) 25675 1727204008.57568: exiting _queue_task() for managed-node2/package 25675 1727204008.57649: done queuing things up, now waiting for results queue to drain 25675 1727204008.57651: waiting for pending results... 
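Both nmstate-related installs are gated on network_state being non-empty, and the task at tasks/main.yml:85 above is skipped because network_state keeps its role default of {}. A sketch of what that task plausibly looks like; the package names and state are inferred from the task title, not taken from the log:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # assumed package names
      - nmstate
    state: present
  when: network_state != {}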
25675 1727204008.57993: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25675 1727204008.58000: in run() - task 028d2410-947f-41bd-b19d-000000000046 25675 1727204008.58011: variable 'ansible_search_path' from source: unknown 25675 1727204008.58088: variable 'ansible_search_path' from source: unknown 25675 1727204008.58093: calling self._execute() 25675 1727204008.58165: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204008.58180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204008.58206: variable 'omit' from source: magic vars 25675 1727204008.59294: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.59299: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204008.59440: variable 'network_state' from source: role '' defaults 25675 1727204008.59526: Evaluated conditional (network_state != {}): False 25675 1727204008.59534: when evaluation is False, skipping this task 25675 1727204008.59542: _execute() done 25675 1727204008.59552: dumping result to json 25675 1727204008.59561: done dumping result, returning 25675 1727204008.59577: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-41bd-b19d-000000000046] 25675 1727204008.59730: sending task result for task 028d2410-947f-41bd-b19d-000000000046 25675 1727204008.59811: done sending task result for task 028d2410-947f-41bd-b19d-000000000046 25675 1727204008.59815: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204008.59885: no more pending results, returning what we have 25675 1727204008.59889: results queue empty 25675 1727204008.59890: checking for any_errors_fatal 25675 1727204008.59897: done checking for any_errors_fatal 25675 1727204008.59897: checking for max_fail_percentage 25675 1727204008.59899: done checking for max_fail_percentage 25675 1727204008.59900: checking to see if all hosts have failed and the running result is not ok 25675 1727204008.59901: done checking to see if all hosts have failed 25675 1727204008.59902: getting the remaining hosts for this loop 25675 1727204008.59904: done getting the remaining hosts for this loop 25675 1727204008.59908: getting the next task for host managed-node2 25675 1727204008.59916: done getting next task for host managed-node2 25675 1727204008.59920: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25675 1727204008.59922: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204008.60057: getting variables 25675 1727204008.60060: in VariableManager get_vars() 25675 1727204008.60105: Calling all_inventory to load vars for managed-node2 25675 1727204008.60109: Calling groups_inventory to load vars for managed-node2 25675 1727204008.60111: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204008.60123: Calling all_plugins_play to load vars for managed-node2 25675 1727204008.60127: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204008.60130: Calling groups_plugins_play to load vars for managed-node2 25675 1727204008.63602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204008.72724: done with get_vars() 25675 1727204008.72757: done getting variables 25675 1727204008.72839: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.157) 0:00:28.179 ***** 25675 1727204008.72867: entering _queue_task() for managed-node2/service 25675 1727204008.73802: worker is 1 (out of 1 available) 25675 1727204008.73812: exiting _queue_task() for managed-node2/service 25675 1727204008.73824: done queuing things up, now waiting for results queue to drain 25675 1727204008.73825: waiting for pending results... 
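[editor's note] A hedged sketch of the task queued above (roles/network/tasks/main.yml:109, 'service' action plugin). The service name and state are assumptions taken from the task name; the when conditions are the ones evaluated in the trace that follows, where the wireless/team condition comes out False and the task is skipped.

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed from the task name
    state: restarted       # assumed from the task name
  when:
    - ansible_distribution_major_version != '6'
    - __network_wireless_connections_defined or __network_team_connections_defined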
25675 1727204008.74340: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25675 1727204008.74544: in run() - task 028d2410-947f-41bd-b19d-000000000047 25675 1727204008.74548: variable 'ansible_search_path' from source: unknown 25675 1727204008.74551: variable 'ansible_search_path' from source: unknown 25675 1727204008.74554: calling self._execute() 25675 1727204008.74619: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204008.74632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204008.74660: variable 'omit' from source: magic vars 25675 1727204008.75069: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.75096: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204008.75221: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204008.75424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204008.78035: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204008.78114: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204008.78151: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204008.78208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204008.78245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204008.78312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.78335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.78385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.78405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.78421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.78467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.78528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.78532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25675 1727204008.78570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.78589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.78635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.78657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.78685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.78723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.78747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.78921: variable 'network_connections' from source: play vars 25675 1727204008.78935: variable 'profile' from source: play vars 25675 1727204008.79011: variable 'profile' from source: play vars 25675 1727204008.79014: variable 'interface' from source: set_fact 25675 1727204008.79082: variable 'interface' from source: set_fact 25675 1727204008.79150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204008.79383: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204008.79390: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204008.79397: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204008.79425: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204008.79468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204008.79496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204008.79518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.79548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204008.79599: variable '__network_team_connections_defined' from source: role '' defaults 25675 
1727204008.79830: variable 'network_connections' from source: play vars 25675 1727204008.79834: variable 'profile' from source: play vars 25675 1727204008.79929: variable 'profile' from source: play vars 25675 1727204008.79937: variable 'interface' from source: set_fact 25675 1727204008.79946: variable 'interface' from source: set_fact 25675 1727204008.79970: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727204008.79974: when evaluation is False, skipping this task 25675 1727204008.79982: _execute() done 25675 1727204008.79985: dumping result to json 25675 1727204008.79987: done dumping result, returning 25675 1727204008.80037: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000047] 25675 1727204008.80053: sending task result for task 028d2410-947f-41bd-b19d-000000000047 25675 1727204008.80116: done sending task result for task 028d2410-947f-41bd-b19d-000000000047 25675 1727204008.80118: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727204008.80186: no more pending results, returning what we have 25675 1727204008.80190: results queue empty 25675 1727204008.80190: checking for any_errors_fatal 25675 1727204008.80200: done checking for any_errors_fatal 25675 1727204008.80201: checking for max_fail_percentage 25675 1727204008.80203: done checking for max_fail_percentage 25675 1727204008.80203: checking to see if all hosts have failed and the running result is not ok 25675 1727204008.80204: done checking to see if all hosts have failed 25675 1727204008.80205: getting the remaining hosts for this loop 25675 1727204008.80206: done getting the remaining hosts for this loop 25675 1727204008.80210: getting the next task for host managed-node2 25675 1727204008.80215: done getting next task for host managed-node2 25675 1727204008.80219: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25675 1727204008.80220: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204008.80235: getting variables 25675 1727204008.80237: in VariableManager get_vars() 25675 1727204008.80282: Calling all_inventory to load vars for managed-node2 25675 1727204008.80285: Calling groups_inventory to load vars for managed-node2 25675 1727204008.80288: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204008.80297: Calling all_plugins_play to load vars for managed-node2 25675 1727204008.80299: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204008.80301: Calling groups_plugins_play to load vars for managed-node2 25675 1727204008.81780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204008.83809: done with get_vars() 25675 1727204008.83839: done getting variables 25675 1727204008.83913: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.110) 0:00:28.290 ***** 25675 1727204008.83945: entering _queue_task() for managed-node2/service 25675 1727204008.84386: worker is 1 (out of 1 available) 25675 1727204008.84400: exiting _queue_task() for managed-node2/service 25675 1727204008.84411: done queuing things up, now waiting for results queue to drain 25675 1727204008.84413: waiting for pending results... 25675 1727204008.84797: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25675 1727204008.84803: in run() - task 028d2410-947f-41bd-b19d-000000000048 25675 1727204008.84807: variable 'ansible_search_path' from source: unknown 25675 1727204008.84810: variable 'ansible_search_path' from source: unknown 25675 1727204008.84826: calling self._execute() 25675 1727204008.84925: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204008.84932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204008.84942: variable 'omit' from source: magic vars 25675 1727204008.85347: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.85383: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204008.85548: variable 'network_provider' from source: set_fact 25675 1727204008.85552: variable 'network_state' from source: role '' defaults 25675 1727204008.85557: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25675 1727204008.85582: variable 'omit' from source: magic vars 25675 1727204008.85610: variable 'omit' from source: magic vars 25675 1727204008.85646: variable 'network_service_name' from source: role '' defaults 25675 1727204008.85721: variable 'network_service_name' from source: role '' defaults 25675 1727204008.85837: variable '__network_provider_setup' from source: role '' defaults 25675 1727204008.85847: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727204008.85984: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727204008.85987: variable '__network_packages_default_nm' from source: role '' defaults 
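[editor's note] A hedged sketch of the task that actually runs here (roles/network/tasks/main.yml:122, 'service' action plugin). The when conditions are the ones evaluated just above; the service name comes from the network_service_name variable the trace resolves, and the effective arguments (name NetworkManager, state started, enabled true) match the module invocation reported in the result further below. The exact role source may differ.

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager on this host
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm" or network_state != {}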
25675 1727204008.85990: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727204008.86203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204008.88388: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204008.88809: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204008.88842: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204008.88872: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204008.88907: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204008.88979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.89185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.89189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.89192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.89194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.89196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.89198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.89205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.89254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.89268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.89781: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25675 1727204008.89962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.90069: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.90072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.90077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.90080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.90174: variable 'ansible_python' from source: facts 25675 1727204008.90200: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25675 1727204008.90480: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727204008.90484: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727204008.90582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.90586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.90589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.90592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.90607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.90654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204008.90690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204008.90712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.90750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204008.90764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204008.90981: variable 'network_connections' from 
source: play vars 25675 1727204008.90984: variable 'profile' from source: play vars 25675 1727204008.91000: variable 'profile' from source: play vars 25675 1727204008.91014: variable 'interface' from source: set_fact 25675 1727204008.91086: variable 'interface' from source: set_fact 25675 1727204008.91221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204008.91558: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204008.91754: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204008.91814: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204008.91860: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204008.91934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204008.91964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204008.92003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204008.92181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204008.92185: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204008.92392: variable 'network_connections' from source: play vars 25675 1727204008.92398: variable 'profile' from source: play vars 25675 1727204008.92468: variable 'profile' from source: play vars 25675 1727204008.92487: variable 'interface' from source: set_fact 25675 1727204008.92581: variable 'interface' from source: set_fact 25675 1727204008.92585: variable '__network_packages_default_wireless' from source: role '' defaults 25675 1727204008.92666: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204008.93047: variable 'network_connections' from source: play vars 25675 1727204008.93050: variable 'profile' from source: play vars 25675 1727204008.93124: variable 'profile' from source: play vars 25675 1727204008.93172: variable 'interface' from source: set_fact 25675 1727204008.93266: variable 'interface' from source: set_fact 25675 1727204008.93294: variable '__network_packages_default_team' from source: role '' defaults 25675 1727204008.93383: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727204008.93973: variable 'network_connections' from source: play vars 25675 1727204008.93978: variable 'profile' from source: play vars 25675 1727204008.94255: variable 'profile' from source: play vars 25675 1727204008.94259: variable 'interface' from source: set_fact 25675 1727204008.94676: variable 'interface' from source: set_fact 25675 1727204008.94735: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727204008.94969: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727204008.94973: 
variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727204008.95114: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727204008.96118: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25675 1727204008.97842: variable 'network_connections' from source: play vars 25675 1727204008.97846: variable 'profile' from source: play vars 25675 1727204008.97915: variable 'profile' from source: play vars 25675 1727204008.98035: variable 'interface' from source: set_fact 25675 1727204008.98400: variable 'interface' from source: set_fact 25675 1727204008.98411: variable 'ansible_distribution' from source: facts 25675 1727204008.98413: variable '__network_rh_distros' from source: role '' defaults 25675 1727204008.98416: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.98483: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25675 1727204008.99132: variable 'ansible_distribution' from source: facts 25675 1727204008.99135: variable '__network_rh_distros' from source: role '' defaults 25675 1727204008.99140: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.99153: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25675 1727204008.99605: variable 'ansible_distribution' from source: facts 25675 1727204008.99611: variable '__network_rh_distros' from source: role '' defaults 25675 1727204008.99614: variable 'ansible_distribution_major_version' from source: facts 25675 1727204008.99644: variable 'network_provider' from source: set_fact 25675 1727204008.99884: variable 'omit' from source: magic vars 25675 1727204008.99887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204008.99967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204009.00103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204009.00120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204009.00131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204009.00160: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204009.00163: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204009.00165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204009.00481: Set connection var ansible_shell_type to sh 25675 1727204009.00586: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204009.00589: Set connection var ansible_timeout to 10 25675 1727204009.00592: Set connection var ansible_pipelining to False 25675 1727204009.00594: Set connection var ansible_shell_executable to /bin/sh 25675 1727204009.00596: Set connection var ansible_connection to ssh 25675 1727204009.00598: variable 'ansible_shell_executable' from source: unknown 25675 1727204009.00600: variable 'ansible_connection' from source: unknown 25675 1727204009.00603: variable 'ansible_module_compression' from source: unknown 25675 1727204009.00605: variable 'ansible_shell_type' from source: unknown 25675 1727204009.00607: variable 'ansible_shell_executable' from 
source: unknown 25675 1727204009.00608: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204009.00615: variable 'ansible_pipelining' from source: unknown 25675 1727204009.00617: variable 'ansible_timeout' from source: unknown 25675 1727204009.00619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204009.00896: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204009.00909: variable 'omit' from source: magic vars 25675 1727204009.00912: starting attempt loop 25675 1727204009.00915: running the handler 25675 1727204009.01182: variable 'ansible_facts' from source: unknown 25675 1727204009.02384: _low_level_execute_command(): starting 25675 1727204009.02387: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204009.02966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204009.02987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204009.02999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204009.03044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204009.03056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204009.03086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204009.03158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204009.04910: stdout chunk (state=3): >>>/root <<< 25675 1727204009.05053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204009.05057: stdout chunk (state=3): >>><<< 25675 1727204009.05059: stderr chunk (state=3): >>><<< 25675 1727204009.05102: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204009.05124: _low_level_execute_command(): starting 25675 1727204009.05210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312 `" && echo ansible-tmp-1727204009.051116-28051-243997563190312="` echo /root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312 `" ) && sleep 0' 25675 1727204009.05765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204009.05888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204009.05891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204009.05905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204009.05921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204009.06015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204009.08002: stdout chunk (state=3): >>>ansible-tmp-1727204009.051116-28051-243997563190312=/root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312 <<< 25675 1727204009.08096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204009.08107: stdout chunk (state=3): >>><<< 25675 1727204009.08142: stderr chunk (state=3): >>><<< 25675 1727204009.08285: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204009.051116-28051-243997563190312=/root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204009.08292: variable 'ansible_module_compression' from source: unknown 25675 1727204009.08349: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 25675 1727204009.08534: variable 'ansible_facts' from source: unknown 25675 1727204009.09106: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/AnsiballZ_systemd.py 25675 1727204009.09342: Sending initial data 25675 1727204009.09389: Sent initial data (155 bytes) 25675 1727204009.10038: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204009.10042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204009.10045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727204009.10047: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204009.10050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204009.10109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204009.10138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204009.10242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204009.11914: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204009.11982: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204009.12053: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp_8_1ls3o /root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/AnsiballZ_systemd.py <<< 25675 1727204009.12060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/AnsiballZ_systemd.py" <<< 25675 1727204009.12204: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp_8_1ls3o" to remote "/root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/AnsiballZ_systemd.py" <<< 25675 1727204009.15196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204009.15208: stdout chunk (state=3): >>><<< 25675 1727204009.15219: stderr chunk (state=3): >>><<< 25675 1727204009.15284: done transferring module to remote 25675 1727204009.15300: _low_level_execute_command(): starting 25675 1727204009.15309: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/ /root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/AnsiballZ_systemd.py && sleep 0' 25675 1727204009.15908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204009.15924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204009.15940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204009.15964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204009.15991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204009.16004: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204009.16017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204009.16061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727204009.16091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204009.16165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204009.16432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204009.16522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204009.18381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204009.18395: stdout chunk (state=3): >>><<< 25675 1727204009.18409: stderr chunk (state=3): >>><<< 
25675 1727204009.18427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204009.18435: _low_level_execute_command(): starting 25675 1727204009.18443: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/AnsiballZ_systemd.py && sleep 0' 25675 1727204009.19025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204009.19039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204009.19051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204009.19064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204009.19084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204009.19174: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204009.19205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204009.19310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204009.48396: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", 
"TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4493312", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301191680", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "746940000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 25675 1727204009.48419: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25675 1727204009.50650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204009.50653: stdout chunk (state=3): >>><<< 25675 1727204009.50655: stderr chunk (state=3): >>><<< 25675 1727204009.50684: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4493312", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301191680", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "746940000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204009.51091: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204009.51202: _low_level_execute_command(): starting 25675 1727204009.51205: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204009.051116-28051-243997563190312/ > /dev/null 2>&1 && sleep 0' 25675 1727204009.52283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204009.52583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204009.52587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204009.52589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 
1727204009.52755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204009.52758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204009.54750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204009.54754: stdout chunk (state=3): >>><<< 25675 1727204009.54786: stderr chunk (state=3): >>><<< 25675 1727204009.54790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204009.54792: handler run complete 25675 1727204009.54852: attempt loop complete, returning result 25675 1727204009.54855: _execute() done 25675 1727204009.54858: dumping result to json 25675 1727204009.54891: done dumping result, returning 25675 1727204009.54897: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-41bd-b19d-000000000048] 25675 1727204009.54899: sending task result for task 028d2410-947f-41bd-b19d-000000000048 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204009.55732: no more pending results, returning what we have 25675 1727204009.55736: results queue empty 25675 1727204009.55737: checking for any_errors_fatal 25675 1727204009.55741: done checking for any_errors_fatal 25675 1727204009.55742: checking for max_fail_percentage 25675 1727204009.55744: done checking for max_fail_percentage 25675 1727204009.55745: checking to see if all hosts have failed and the running result is not ok 25675 1727204009.55746: done checking to see if all hosts have failed 25675 1727204009.55746: getting the remaining hosts for this loop 25675 1727204009.55748: done getting the remaining hosts for this loop 25675 1727204009.55752: getting the next task for host managed-node2 25675 1727204009.55758: done getting next task for host managed-node2 25675 1727204009.55762: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25675 1727204009.55764: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 25675 1727204009.55774: getting variables 25675 1727204009.55780: in VariableManager get_vars() 25675 1727204009.55817: Calling all_inventory to load vars for managed-node2 25675 1727204009.55820: Calling groups_inventory to load vars for managed-node2 25675 1727204009.55822: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204009.55831: Calling all_plugins_play to load vars for managed-node2 25675 1727204009.55833: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204009.55835: Calling groups_plugins_play to load vars for managed-node2 25675 1727204009.56429: done sending task result for task 028d2410-947f-41bd-b19d-000000000048 25675 1727204009.56432: WORKER PROCESS EXITING 25675 1727204009.59308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204009.63202: done with get_vars() 25675 1727204009.63292: done getting variables 25675 1727204009.63480: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:53:29 -0400 (0:00:00.795) 0:00:29.086 ***** 25675 1727204009.63508: entering _queue_task() for managed-node2/service 25675 1727204009.64299: worker is 1 (out of 1 available) 25675 1727204009.64312: exiting _queue_task() for managed-node2/service 25675 1727204009.64439: done queuing things up, now waiting for results queue to drain 25675 1727204009.64441: waiting for pending results... 
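The `status` map in the NetworkManager result above is the unit's systemd property set: the same Key=Value pairs (LoadState, ActiveState, UnitFileState, the Limit* values, and so on) that `systemctl show NetworkManager.service` prints, which is why the task reports `changed: false` for a unit that is already active and enabled. A minimal sketch of reading those properties on a systemd host, assuming `systemctl` is on PATH; this is an illustration, not the systemd module's actual implementation:

```python
# Sketch only: collect a unit's properties as a dict of the Key=Value lines
# printed by `systemctl show`, mirroring the "status" map in the log above.
import subprocess

def unit_properties(unit: str) -> dict[str, str]:
    out = subprocess.run(
        ["systemctl", "show", unit],
        capture_output=True, text=True, check=True,
    ).stdout
    # Each line is "Key=Value"; split only on the first '=' so values keep theirs.
    return dict(line.split("=", 1) for line in out.splitlines() if "=" in line)

if __name__ == "__main__":
    props = unit_properties("NetworkManager.service")
    print(props.get("LoadState"), props.get("ActiveState"), props.get("UnitFileState"))
```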
25675 1727204009.64859: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25675 1727204009.64957: in run() - task 028d2410-947f-41bd-b19d-000000000049 25675 1727204009.65098: variable 'ansible_search_path' from source: unknown 25675 1727204009.65106: variable 'ansible_search_path' from source: unknown 25675 1727204009.65109: calling self._execute() 25675 1727204009.65123: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204009.65135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204009.65149: variable 'omit' from source: magic vars 25675 1727204009.65651: variable 'ansible_distribution_major_version' from source: facts 25675 1727204009.65654: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204009.65691: variable 'network_provider' from source: set_fact 25675 1727204009.65702: Evaluated conditional (network_provider == "nm"): True 25675 1727204009.65800: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727204009.65896: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727204009.66247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204009.70737: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204009.71184: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204009.71192: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204009.71194: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204009.71197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204009.71251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204009.71582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204009.71586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204009.71589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204009.71606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204009.71655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204009.71682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25675 1727204009.71708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204009.71824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204009.71846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204009.71898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204009.72284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204009.72287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204009.72289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204009.72291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204009.72409: variable 'network_connections' from source: play vars 25675 1727204009.72427: variable 'profile' from source: play vars 25675 1727204009.72513: variable 'profile' from source: play vars 25675 1727204009.72884: variable 'interface' from source: set_fact 25675 1727204009.72887: variable 'interface' from source: set_fact 25675 1727204009.73099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204009.73267: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204009.73520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204009.73557: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204009.73881: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204009.73884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204009.73886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204009.73903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204009.73928: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204009.73982: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204009.74539: variable 'network_connections' from source: play vars 25675 1727204009.74551: variable 'profile' from source: play vars 25675 1727204009.74623: variable 'profile' from source: play vars 25675 1727204009.74791: variable 'interface' from source: set_fact 25675 1727204009.74854: variable 'interface' from source: set_fact 25675 1727204009.74897: Evaluated conditional (__network_wpa_supplicant_required): False 25675 1727204009.75281: when evaluation is False, skipping this task 25675 1727204009.75285: _execute() done 25675 1727204009.75296: dumping result to json 25675 1727204009.75299: done dumping result, returning 25675 1727204009.75301: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-41bd-b19d-000000000049] 25675 1727204009.75304: sending task result for task 028d2410-947f-41bd-b19d-000000000049 25675 1727204009.75380: done sending task result for task 028d2410-947f-41bd-b19d-000000000049 25675 1727204009.75385: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25675 1727204009.75443: no more pending results, returning what we have 25675 1727204009.75447: results queue empty 25675 1727204009.75448: checking for any_errors_fatal 25675 1727204009.75481: done checking for any_errors_fatal 25675 1727204009.75482: checking for max_fail_percentage 25675 1727204009.75484: done checking for max_fail_percentage 25675 1727204009.75492: checking to see if all hosts have failed and the running result is not ok 25675 1727204009.75494: done checking to see if all hosts have failed 25675 1727204009.75494: getting the remaining hosts for this loop 25675 1727204009.75496: done getting the remaining hosts for this loop 25675 1727204009.75501: getting the next task for host managed-node2 25675 1727204009.75508: done getting next task for host managed-node2 25675 1727204009.75512: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25675 1727204009.75514: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204009.75530: getting variables 25675 1727204009.75532: in VariableManager get_vars() 25675 1727204009.75569: Calling all_inventory to load vars for managed-node2 25675 1727204009.75572: Calling groups_inventory to load vars for managed-node2 25675 1727204009.75574: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204009.75766: Calling all_plugins_play to load vars for managed-node2 25675 1727204009.75769: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204009.75771: Calling groups_plugins_play to load vars for managed-node2 25675 1727204009.78371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204009.82554: done with get_vars() 25675 1727204009.82707: done getting variables 25675 1727204009.82770: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:53:29 -0400 (0:00:00.192) 0:00:29.279 ***** 25675 1727204009.82808: entering _queue_task() for managed-node2/service 25675 1727204009.83228: worker is 1 (out of 1 available) 25675 1727204009.83242: exiting _queue_task() for managed-node2/service 25675 1727204009.83253: done queuing things up, now waiting for results queue to drain 25675 1727204009.83254: waiting for pending results... 25675 1727204009.83548: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 25675 1727204009.83683: in run() - task 028d2410-947f-41bd-b19d-00000000004a 25675 1727204009.83710: variable 'ansible_search_path' from source: unknown 25675 1727204009.83719: variable 'ansible_search_path' from source: unknown 25675 1727204009.83761: calling self._execute() 25675 1727204009.83893: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204009.83905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204009.83925: variable 'omit' from source: magic vars 25675 1727204009.84531: variable 'ansible_distribution_major_version' from source: facts 25675 1727204009.84595: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204009.84718: variable 'network_provider' from source: set_fact 25675 1727204009.84801: Evaluated conditional (network_provider == "initscripts"): False 25675 1727204009.84809: when evaluation is False, skipping this task 25675 1727204009.84817: _execute() done 25675 1727204009.84824: dumping result to json 25675 1727204009.84830: done dumping result, returning 25675 1727204009.84842: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-41bd-b19d-00000000004a] 25675 1727204009.84851: sending task result for task 028d2410-947f-41bd-b19d-00000000004a skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204009.85053: no more pending results, returning what we have 25675 1727204009.85058: results queue empty 25675 1727204009.85059: checking for 
any_errors_fatal 25675 1727204009.85066: done checking for any_errors_fatal 25675 1727204009.85067: checking for max_fail_percentage 25675 1727204009.85070: done checking for max_fail_percentage 25675 1727204009.85071: checking to see if all hosts have failed and the running result is not ok 25675 1727204009.85071: done checking to see if all hosts have failed 25675 1727204009.85072: getting the remaining hosts for this loop 25675 1727204009.85073: done getting the remaining hosts for this loop 25675 1727204009.85081: getting the next task for host managed-node2 25675 1727204009.85088: done getting next task for host managed-node2 25675 1727204009.85092: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25675 1727204009.85095: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204009.85112: getting variables 25675 1727204009.85115: in VariableManager get_vars() 25675 1727204009.85160: Calling all_inventory to load vars for managed-node2 25675 1727204009.85163: Calling groups_inventory to load vars for managed-node2 25675 1727204009.85166: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204009.85688: Calling all_plugins_play to load vars for managed-node2 25675 1727204009.85693: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204009.85698: Calling groups_plugins_play to load vars for managed-node2 25675 1727204009.86469: done sending task result for task 028d2410-947f-41bd-b19d-00000000004a 25675 1727204009.86473: WORKER PROCESS EXITING 25675 1727204009.87771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204009.89713: done with get_vars() 25675 1727204009.89747: done getting variables 25675 1727204009.89808: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:53:29 -0400 (0:00:00.070) 0:00:29.349 ***** 25675 1727204009.89840: entering _queue_task() for managed-node2/copy 25675 1727204009.90569: worker is 1 (out of 1 available) 25675 1727204009.90620: exiting _queue_task() for managed-node2/copy 25675 1727204009.90634: done queuing things up, now waiting for results queue to drain 25675 1727204009.90635: waiting for pending results... 
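The `Evaluated conditional (...)` records above are each task's `when` expression rendered as a Jinja2 test against the host's variables; when the result is False the task is skipped with `skip_reason: Conditional result was False`, as happened for wpa_supplicant and the initscripts-only tasks. A toy sketch of that evaluation using Jinja2 directly, with assumed variable values rather than the real facts; Ansible's own Templar additionally handles `when` lists, undefined variables, and templating of the variables themselves:

```python
# Toy sketch: evaluate when-style expressions the way the log records them,
# using jinja2.Environment.compile_expression and made-up host variables.
from jinja2 import Environment

host_vars = {
    "ansible_distribution_major_version": "40",      # assumed fact value
    "network_provider": "nm",                        # set_fact value seen in the log
    "__network_wpa_supplicant_required": False,      # role default seen in the log
}

conditions = [
    "ansible_distribution_major_version != '6'",
    'network_provider == "nm"',
    "__network_wpa_supplicant_required",
    'network_provider == "initscripts"',
]

env = Environment()
for cond in conditions:
    result = bool(env.compile_expression(cond)(**host_vars))
    print(f"Evaluated conditional ({cond}): {result}")
```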
25675 1727204009.91086: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25675 1727204009.91169: in run() - task 028d2410-947f-41bd-b19d-00000000004b 25675 1727204009.91405: variable 'ansible_search_path' from source: unknown 25675 1727204009.91409: variable 'ansible_search_path' from source: unknown 25675 1727204009.91444: calling self._execute() 25675 1727204009.91534: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204009.91541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204009.91551: variable 'omit' from source: magic vars 25675 1727204009.92463: variable 'ansible_distribution_major_version' from source: facts 25675 1727204009.92473: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204009.92768: variable 'network_provider' from source: set_fact 25675 1727204009.92773: Evaluated conditional (network_provider == "initscripts"): False 25675 1727204009.92779: when evaluation is False, skipping this task 25675 1727204009.92782: _execute() done 25675 1727204009.92785: dumping result to json 25675 1727204009.92787: done dumping result, returning 25675 1727204009.92798: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-41bd-b19d-00000000004b] 25675 1727204009.92801: sending task result for task 028d2410-947f-41bd-b19d-00000000004b 25675 1727204009.92981: done sending task result for task 028d2410-947f-41bd-b19d-00000000004b 25675 1727204009.92985: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 25675 1727204009.93029: no more pending results, returning what we have 25675 1727204009.93032: results queue empty 25675 1727204009.93033: checking for any_errors_fatal 25675 1727204009.93037: done checking for any_errors_fatal 25675 1727204009.93038: checking for max_fail_percentage 25675 1727204009.93039: done checking for max_fail_percentage 25675 1727204009.93040: checking to see if all hosts have failed and the running result is not ok 25675 1727204009.93041: done checking to see if all hosts have failed 25675 1727204009.93042: getting the remaining hosts for this loop 25675 1727204009.93043: done getting the remaining hosts for this loop 25675 1727204009.93046: getting the next task for host managed-node2 25675 1727204009.93051: done getting next task for host managed-node2 25675 1727204009.93054: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25675 1727204009.93056: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204009.93070: getting variables 25675 1727204009.93071: in VariableManager get_vars() 25675 1727204009.93105: Calling all_inventory to load vars for managed-node2 25675 1727204009.93108: Calling groups_inventory to load vars for managed-node2 25675 1727204009.93110: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204009.93117: Calling all_plugins_play to load vars for managed-node2 25675 1727204009.93120: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204009.93122: Calling groups_plugins_play to load vars for managed-node2 25675 1727204009.95650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204009.97744: done with get_vars() 25675 1727204009.97939: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:53:29 -0400 (0:00:00.081) 0:00:29.431 ***** 25675 1727204009.98028: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 25675 1727204009.99006: worker is 1 (out of 1 available) 25675 1727204009.99020: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 25675 1727204009.99032: done queuing things up, now waiting for results queue to drain 25675 1727204009.99033: waiting for pending results... 25675 1727204009.99779: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25675 1727204010.00029: in run() - task 028d2410-947f-41bd-b19d-00000000004c 25675 1727204010.00054: variable 'ansible_search_path' from source: unknown 25675 1727204010.00384: variable 'ansible_search_path' from source: unknown 25675 1727204010.00388: calling self._execute() 25675 1727204010.00390: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.00393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.00395: variable 'omit' from source: magic vars 25675 1727204010.01242: variable 'ansible_distribution_major_version' from source: facts 25675 1727204010.01261: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204010.01280: variable 'omit' from source: magic vars 25675 1727204010.01328: variable 'omit' from source: magic vars 25675 1727204010.01648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204010.05944: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204010.06031: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204010.06072: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204010.06126: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204010.06224: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204010.06259: variable 'network_provider' from source: set_fact 25675 1727204010.06410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 25675 1727204010.06471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204010.06509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204010.06560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204010.06584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204010.06674: variable 'omit' from source: magic vars 25675 1727204010.06809: variable 'omit' from source: magic vars 25675 1727204010.06987: variable 'network_connections' from source: play vars 25675 1727204010.06990: variable 'profile' from source: play vars 25675 1727204010.07021: variable 'profile' from source: play vars 25675 1727204010.07031: variable 'interface' from source: set_fact 25675 1727204010.07102: variable 'interface' from source: set_fact 25675 1727204010.07253: variable 'omit' from source: magic vars 25675 1727204010.07268: variable '__lsr_ansible_managed' from source: task vars 25675 1727204010.07340: variable '__lsr_ansible_managed' from source: task vars 25675 1727204010.07651: Loaded config def from plugin (lookup/template) 25675 1727204010.07662: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25675 1727204010.07699: File lookup term: get_ansible_managed.j2 25675 1727204010.07745: variable 'ansible_search_path' from source: unknown 25675 1727204010.07749: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25675 1727204010.07752: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25675 1727204010.07761: variable 'ansible_search_path' from source: unknown 25675 1727204010.14885: variable 'ansible_managed' from source: unknown 25675 1727204010.14890: variable 'omit' from source: magic vars 25675 1727204010.14900: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204010.14931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204010.14947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204010.14964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204010.14980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204010.15006: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204010.15013: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.15016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.15123: Set connection var ansible_shell_type to sh 25675 1727204010.15127: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204010.15129: Set connection var ansible_timeout to 10 25675 1727204010.15132: Set connection var ansible_pipelining to False 25675 1727204010.15133: Set connection var ansible_shell_executable to /bin/sh 25675 1727204010.15135: Set connection var ansible_connection to ssh 25675 1727204010.15154: variable 'ansible_shell_executable' from source: unknown 25675 1727204010.15157: variable 'ansible_connection' from source: unknown 25675 1727204010.15159: variable 'ansible_module_compression' from source: unknown 25675 1727204010.15162: variable 'ansible_shell_type' from source: unknown 25675 1727204010.15164: variable 'ansible_shell_executable' from source: unknown 25675 1727204010.15167: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.15382: variable 'ansible_pipelining' from source: unknown 25675 1727204010.15385: variable 'ansible_timeout' from source: unknown 25675 1727204010.15388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.15391: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204010.15401: variable 'omit' from source: magic vars 25675 1727204010.15404: starting attempt loop 25675 1727204010.15406: running the handler 25675 1727204010.15408: _low_level_execute_command(): starting 25675 1727204010.15411: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204010.16061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204010.16084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204010.16088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204010.16098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204010.16110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204010.16116: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204010.16126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204010.16138: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 25675 1727204010.16145: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204010.16151: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727204010.16164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204010.16174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204010.16188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204010.16284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204010.16288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204010.16560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204010.18271: stdout chunk (state=3): >>>/root <<< 25675 1727204010.18486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204010.18489: stdout chunk (state=3): >>><<< 25675 1727204010.18491: stderr chunk (state=3): >>><<< 25675 1727204010.18702: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204010.18713: _low_level_execute_command(): starting 25675 1727204010.18720: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049 `" && echo ansible-tmp-1727204010.1869934-28165-272010648887049="` echo /root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049 `" ) && sleep 0' 25675 1727204010.20317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204010.20504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204010.22434: stdout chunk (state=3): >>>ansible-tmp-1727204010.1869934-28165-272010648887049=/root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049 <<< 25675 1727204010.22626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204010.22745: stderr chunk (state=3): >>><<< 25675 1727204010.22755: stdout chunk (state=3): >>><<< 25675 1727204010.22786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204010.1869934-28165-272010648887049=/root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204010.22897: variable 'ansible_module_compression' from source: unknown 25675 1727204010.23027: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 25675 1727204010.23313: variable 'ansible_facts' from source: unknown 25675 1727204010.23556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/AnsiballZ_network_connections.py 25675 1727204010.23958: Sending initial data 25675 1727204010.23961: Sent initial data (168 bytes) 25675 1727204010.25099: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204010.25148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204010.25398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204010.25420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204010.25442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204010.25544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204010.27289: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204010.27348: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204010.27450: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpg5xz0xu6 /root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/AnsiballZ_network_connections.py <<< 25675 1727204010.27454: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/AnsiballZ_network_connections.py" <<< 25675 1727204010.27606: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpg5xz0xu6" to remote "/root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/AnsiballZ_network_connections.py" <<< 25675 1727204010.29715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204010.29719: stdout chunk (state=3): >>><<< 25675 1727204010.29721: stderr chunk (state=3): >>><<< 25675 1727204010.29931: done transferring module to remote 25675 1727204010.29935: _low_level_execute_command(): starting 25675 1727204010.29937: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/ /root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/AnsiballZ_network_connections.py && sleep 0' 25675 1727204010.31514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204010.31532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204010.31593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204010.31839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204010.31860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204010.31959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204010.33883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204010.33897: stdout chunk (state=3): >>><<< 25675 1727204010.33909: stderr chunk (state=3): >>><<< 25675 1727204010.34069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204010.34072: _low_level_execute_command(): starting 25675 1727204010.34074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/AnsiballZ_network_connections.py && sleep 0' 25675 1727204010.35013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204010.35270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204010.35293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204010.35308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204010.35484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204010.65966: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 25675 1727204010.68127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204010.68140: stdout chunk (state=3): >>><<< 25675 1727204010.68153: stderr chunk (state=3): >>><<< 25675 1727204010.68402: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
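The module result above shows the arguments the network role handed to fedora.linux_system_roles.network_connections: provider "nm" and a single connection {"name": "lsr27", "state": "down"}. As a hedged sketch only (the actual test playbook is not reproduced in this log, and the hosts/vars layout below is an assumption), an equivalent invocation of the role would look roughly like:

    # Sketch reconstructed from the module_args above; play layout is assumed,
    # not copied from the test playbook driving this run.
    - hosts: managed-node2
      vars:
        network_provider: nm
        network_connections:
          - name: lsr27
            state: down
      roles:
        - fedora.linux_system_roles.network

With state: down the role asks NetworkManager to take the existing "lsr27" profile down (deactivate it without deleting it), which is consistent with the "changed": true result the module reports here.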
25675 1727204010.68405: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204010.68408: _low_level_execute_command(): starting 25675 1727204010.68409: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204010.1869934-28165-272010648887049/ > /dev/null 2>&1 && sleep 0' 25675 1727204010.69626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204010.69737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204010.70010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204010.72127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204010.72131: stdout chunk (state=3): >>><<< 25675 1727204010.72134: stderr chunk (state=3): >>><<< 25675 1727204010.72137: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204010.72139: handler run complete 25675 1727204010.72248: attempt loop complete, returning result 25675 1727204010.72252: _execute() done 25675 1727204010.72254: dumping result to json 25675 1727204010.72257: done dumping result, returning 25675 1727204010.72259: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-41bd-b19d-00000000004c] 25675 1727204010.72261: sending task result for task 028d2410-947f-41bd-b19d-00000000004c 25675 1727204010.72519: done sending task result for task 028d2410-947f-41bd-b19d-00000000004c 25675 1727204010.72524: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 25675 1727204010.72751: no more pending results, returning what we have 25675 1727204010.72754: results queue empty 25675 1727204010.72755: checking for any_errors_fatal 25675 1727204010.72760: done checking for any_errors_fatal 25675 1727204010.72764: checking for max_fail_percentage 25675 1727204010.72768: done checking for max_fail_percentage 25675 1727204010.72768: checking to see if all hosts have failed and the running result is not ok 25675 1727204010.72769: done checking to see if all hosts have failed 25675 1727204010.72770: getting the remaining hosts for this loop 25675 1727204010.72771: done getting the remaining hosts for this loop 25675 1727204010.72777: getting the next task for host managed-node2 25675 1727204010.72783: done getting next task for host managed-node2 25675 1727204010.72786: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 25675 1727204010.72788: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204010.72798: getting variables 25675 1727204010.72800: in VariableManager get_vars() 25675 1727204010.72834: Calling all_inventory to load vars for managed-node2 25675 1727204010.72836: Calling groups_inventory to load vars for managed-node2 25675 1727204010.72839: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204010.72847: Calling all_plugins_play to load vars for managed-node2 25675 1727204010.72850: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204010.72852: Calling groups_plugins_play to load vars for managed-node2 25675 1727204010.75586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204010.79369: done with get_vars() 25675 1727204010.79410: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:53:30 -0400 (0:00:00.814) 0:00:30.246 ***** 25675 1727204010.79513: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 25675 1727204010.80034: worker is 1 (out of 1 available) 25675 1727204010.80050: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 25675 1727204010.80063: done queuing things up, now waiting for results queue to drain 25675 1727204010.80064: waiting for pending results... 25675 1727204010.81095: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 25675 1727204010.81156: in run() - task 028d2410-947f-41bd-b19d-00000000004d 25675 1727204010.81384: variable 'ansible_search_path' from source: unknown 25675 1727204010.81388: variable 'ansible_search_path' from source: unknown 25675 1727204010.81392: calling self._execute() 25675 1727204010.81456: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.81530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.81682: variable 'omit' from source: magic vars 25675 1727204010.82771: variable 'ansible_distribution_major_version' from source: facts 25675 1727204010.83002: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204010.84099: variable 'network_state' from source: role '' defaults 25675 1727204010.84104: Evaluated conditional (network_state != {}): False 25675 1727204010.84107: when evaluation is False, skipping this task 25675 1727204010.84110: _execute() done 25675 1727204010.84113: dumping result to json 25675 1727204010.84115: done dumping result, returning 25675 1727204010.84117: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-41bd-b19d-00000000004d] 25675 1727204010.84119: sending task result for task 028d2410-947f-41bd-b19d-00000000004d 25675 1727204010.84203: done sending task result for task 028d2410-947f-41bd-b19d-00000000004d 25675 1727204010.84208: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204010.84263: no more pending results, returning what we have 25675 1727204010.84267: results queue empty 25675 1727204010.84269: checking for any_errors_fatal 25675 1727204010.84281: done checking for any_errors_fatal 25675 1727204010.84288: checking for max_fail_percentage 25675 
1727204010.84291: done checking for max_fail_percentage 25675 1727204010.84292: checking to see if all hosts have failed and the running result is not ok 25675 1727204010.84293: done checking to see if all hosts have failed 25675 1727204010.84294: getting the remaining hosts for this loop 25675 1727204010.84296: done getting the remaining hosts for this loop 25675 1727204010.84300: getting the next task for host managed-node2 25675 1727204010.84309: done getting next task for host managed-node2 25675 1727204010.84313: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25675 1727204010.84317: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204010.84334: getting variables 25675 1727204010.84336: in VariableManager get_vars() 25675 1727204010.84487: Calling all_inventory to load vars for managed-node2 25675 1727204010.84490: Calling groups_inventory to load vars for managed-node2 25675 1727204010.84493: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204010.84504: Calling all_plugins_play to load vars for managed-node2 25675 1727204010.84507: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204010.84509: Calling groups_plugins_play to load vars for managed-node2 25675 1727204010.86898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204010.89138: done with get_vars() 25675 1727204010.89167: done getting variables 25675 1727204010.89230: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:53:30 -0400 (0:00:00.097) 0:00:30.343 ***** 25675 1727204010.89265: entering _queue_task() for managed-node2/debug 25675 1727204010.89870: worker is 1 (out of 1 available) 25675 1727204010.89887: exiting _queue_task() for managed-node2/debug 25675 1727204010.89900: done queuing things up, now waiting for results queue to drain 25675 1727204010.89901: waiting for pending results... 
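The "Configure networking state" task above is skipped because network_state comes from the role defaults as an empty dict, so the conditional network_state != {} evaluates to False. For contrast, a hedged illustration of what would make that branch run (the interface name and values below are assumptions for illustration, not taken from this run):

    # Illustrative only: any non-empty network_state makes the conditional true
    # and routes configuration through the role's nmstate-based path instead of
    # the network_connections path exercised in this log.
    network_state:
      interfaces:
        - name: eth0          # assumed device name
          type: ethernet
          state: up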
25675 1727204010.90389: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25675 1727204010.90884: in run() - task 028d2410-947f-41bd-b19d-00000000004e 25675 1727204010.90887: variable 'ansible_search_path' from source: unknown 25675 1727204010.90890: variable 'ansible_search_path' from source: unknown 25675 1727204010.90893: calling self._execute() 25675 1727204010.91020: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.91024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.91035: variable 'omit' from source: magic vars 25675 1727204010.91951: variable 'ansible_distribution_major_version' from source: facts 25675 1727204010.91961: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204010.91969: variable 'omit' from source: magic vars 25675 1727204010.92010: variable 'omit' from source: magic vars 25675 1727204010.92172: variable 'omit' from source: magic vars 25675 1727204010.92295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204010.92328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204010.92424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204010.92442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204010.92454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204010.92599: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204010.92603: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.92606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.92719: Set connection var ansible_shell_type to sh 25675 1727204010.92731: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204010.92741: Set connection var ansible_timeout to 10 25675 1727204010.92756: Set connection var ansible_pipelining to False 25675 1727204010.92767: Set connection var ansible_shell_executable to /bin/sh 25675 1727204010.92774: Set connection var ansible_connection to ssh 25675 1727204010.92817: variable 'ansible_shell_executable' from source: unknown 25675 1727204010.92826: variable 'ansible_connection' from source: unknown 25675 1727204010.92858: variable 'ansible_module_compression' from source: unknown 25675 1727204010.92861: variable 'ansible_shell_type' from source: unknown 25675 1727204010.92864: variable 'ansible_shell_executable' from source: unknown 25675 1727204010.92866: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.92868: variable 'ansible_pipelining' from source: unknown 25675 1727204010.92870: variable 'ansible_timeout' from source: unknown 25675 1727204010.92872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.93080: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 
1727204010.93084: variable 'omit' from source: magic vars 25675 1727204010.93086: starting attempt loop 25675 1727204010.93088: running the handler 25675 1727204010.93221: variable '__network_connections_result' from source: set_fact 25675 1727204010.93273: handler run complete 25675 1727204010.93302: attempt loop complete, returning result 25675 1727204010.93308: _execute() done 25675 1727204010.93313: dumping result to json 25675 1727204010.93381: done dumping result, returning 25675 1727204010.93385: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-41bd-b19d-00000000004e] 25675 1727204010.93387: sending task result for task 028d2410-947f-41bd-b19d-00000000004e 25675 1727204010.93682: done sending task result for task 028d2410-947f-41bd-b19d-00000000004e 25675 1727204010.93686: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 25675 1727204010.93745: no more pending results, returning what we have 25675 1727204010.93748: results queue empty 25675 1727204010.93749: checking for any_errors_fatal 25675 1727204010.93754: done checking for any_errors_fatal 25675 1727204010.93755: checking for max_fail_percentage 25675 1727204010.93756: done checking for max_fail_percentage 25675 1727204010.93757: checking to see if all hosts have failed and the running result is not ok 25675 1727204010.93758: done checking to see if all hosts have failed 25675 1727204010.93759: getting the remaining hosts for this loop 25675 1727204010.93760: done getting the remaining hosts for this loop 25675 1727204010.93764: getting the next task for host managed-node2 25675 1727204010.93770: done getting next task for host managed-node2 25675 1727204010.93774: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25675 1727204010.93780: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204010.93791: getting variables 25675 1727204010.93793: in VariableManager get_vars() 25675 1727204010.93829: Calling all_inventory to load vars for managed-node2 25675 1727204010.93833: Calling groups_inventory to load vars for managed-node2 25675 1727204010.93835: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204010.93844: Calling all_plugins_play to load vars for managed-node2 25675 1727204010.93847: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204010.93849: Calling groups_plugins_play to load vars for managed-node2 25675 1727204010.95713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204010.97354: done with get_vars() 25675 1727204010.97389: done getting variables 25675 1727204010.97451: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:53:30 -0400 (0:00:00.082) 0:00:30.426 ***** 25675 1727204010.97487: entering _queue_task() for managed-node2/debug 25675 1727204010.97861: worker is 1 (out of 1 available) 25675 1727204010.98079: exiting _queue_task() for managed-node2/debug 25675 1727204010.98091: done queuing things up, now waiting for results queue to drain 25675 1727204010.98093: waiting for pending results... 25675 1727204010.98186: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25675 1727204010.98296: in run() - task 028d2410-947f-41bd-b19d-00000000004f 25675 1727204010.98321: variable 'ansible_search_path' from source: unknown 25675 1727204010.98329: variable 'ansible_search_path' from source: unknown 25675 1727204010.98366: calling self._execute() 25675 1727204010.98458: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.98470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.98533: variable 'omit' from source: magic vars 25675 1727204010.98857: variable 'ansible_distribution_major_version' from source: facts 25675 1727204010.98879: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204010.98893: variable 'omit' from source: magic vars 25675 1727204010.98938: variable 'omit' from source: magic vars 25675 1727204010.98987: variable 'omit' from source: magic vars 25675 1727204010.99037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204010.99084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204010.99190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204010.99194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204010.99197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204010.99200: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204010.99202: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.99204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.99307: Set connection var ansible_shell_type to sh 25675 1727204010.99318: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204010.99328: Set connection var ansible_timeout to 10 25675 1727204010.99337: Set connection var ansible_pipelining to False 25675 1727204010.99346: Set connection var ansible_shell_executable to /bin/sh 25675 1727204010.99352: Set connection var ansible_connection to ssh 25675 1727204010.99387: variable 'ansible_shell_executable' from source: unknown 25675 1727204010.99396: variable 'ansible_connection' from source: unknown 25675 1727204010.99408: variable 'ansible_module_compression' from source: unknown 25675 1727204010.99417: variable 'ansible_shell_type' from source: unknown 25675 1727204010.99423: variable 'ansible_shell_executable' from source: unknown 25675 1727204010.99430: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204010.99437: variable 'ansible_pipelining' from source: unknown 25675 1727204010.99446: variable 'ansible_timeout' from source: unknown 25675 1727204010.99454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204010.99625: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204010.99628: variable 'omit' from source: magic vars 25675 1727204010.99639: starting attempt loop 25675 1727204010.99708: running the handler 25675 1727204010.99711: variable '__network_connections_result' from source: set_fact 25675 1727204010.99815: variable '__network_connections_result' from source: set_fact 25675 1727204010.99929: handler run complete 25675 1727204010.99963: attempt loop complete, returning result 25675 1727204010.99971: _execute() done 25675 1727204010.99980: dumping result to json 25675 1727204010.99990: done dumping result, returning 25675 1727204011.00003: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-41bd-b19d-00000000004f] 25675 1727204011.00012: sending task result for task 028d2410-947f-41bd-b19d-00000000004f ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 25675 1727204011.00201: no more pending results, returning what we have 25675 1727204011.00205: results queue empty 25675 1727204011.00206: checking for any_errors_fatal 25675 1727204011.00214: done checking for any_errors_fatal 25675 1727204011.00215: checking for max_fail_percentage 25675 1727204011.00216: done checking for max_fail_percentage 25675 1727204011.00218: checking to see if all hosts have failed and the running result is not ok 25675 1727204011.00219: done checking to see if all hosts have failed 25675 1727204011.00219: getting 
the remaining hosts for this loop 25675 1727204011.00221: done getting the remaining hosts for this loop 25675 1727204011.00225: getting the next task for host managed-node2 25675 1727204011.00232: done getting next task for host managed-node2 25675 1727204011.00236: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25675 1727204011.00238: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204011.00250: getting variables 25675 1727204011.00252: in VariableManager get_vars() 25675 1727204011.00592: Calling all_inventory to load vars for managed-node2 25675 1727204011.00595: Calling groups_inventory to load vars for managed-node2 25675 1727204011.00597: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204011.00605: Calling all_plugins_play to load vars for managed-node2 25675 1727204011.00607: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204011.00610: Calling groups_plugins_play to load vars for managed-node2 25675 1727204011.01291: done sending task result for task 028d2410-947f-41bd-b19d-00000000004f 25675 1727204011.01295: WORKER PROCESS EXITING 25675 1727204011.02115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204011.03693: done with get_vars() 25675 1727204011.03726: done getting variables 25675 1727204011.03792: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:53:31 -0400 (0:00:00.063) 0:00:30.489 ***** 25675 1727204011.03829: entering _queue_task() for managed-node2/debug 25675 1727204011.04194: worker is 1 (out of 1 available) 25675 1727204011.04206: exiting _queue_task() for managed-node2/debug 25675 1727204011.04218: done queuing things up, now waiting for results queue to drain 25675 1727204011.04219: waiting for pending results... 
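The two debug tasks reported above simply print parts of the __network_connections_result fact (set via set_fact from the module run). A paraphrased sketch of those tasks, reconstructed from the task names and the "ok:" output shown in the log rather than copied from the role's tasks/main.yml:

    # Paraphrased sketch; task names follow the log, bodies are inferred from
    # the debug output format.
    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result

The first prints only the captured stderr lines (a single empty string here); the second dumps the full result, including the echoed module arguments.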
25675 1727204011.04493: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25675 1727204011.04601: in run() - task 028d2410-947f-41bd-b19d-000000000050 25675 1727204011.04625: variable 'ansible_search_path' from source: unknown 25675 1727204011.04631: variable 'ansible_search_path' from source: unknown 25675 1727204011.04670: calling self._execute() 25675 1727204011.04771: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204011.04785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204011.04798: variable 'omit' from source: magic vars 25675 1727204011.05189: variable 'ansible_distribution_major_version' from source: facts 25675 1727204011.05208: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204011.05346: variable 'network_state' from source: role '' defaults 25675 1727204011.05363: Evaluated conditional (network_state != {}): False 25675 1727204011.05377: when evaluation is False, skipping this task 25675 1727204011.05386: _execute() done 25675 1727204011.05395: dumping result to json 25675 1727204011.05404: done dumping result, returning 25675 1727204011.05415: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-41bd-b19d-000000000050] 25675 1727204011.05426: sending task result for task 028d2410-947f-41bd-b19d-000000000050 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 25675 1727204011.05580: no more pending results, returning what we have 25675 1727204011.05584: results queue empty 25675 1727204011.05585: checking for any_errors_fatal 25675 1727204011.05596: done checking for any_errors_fatal 25675 1727204011.05597: checking for max_fail_percentage 25675 1727204011.05598: done checking for max_fail_percentage 25675 1727204011.05600: checking to see if all hosts have failed and the running result is not ok 25675 1727204011.05600: done checking to see if all hosts have failed 25675 1727204011.05601: getting the remaining hosts for this loop 25675 1727204011.05603: done getting the remaining hosts for this loop 25675 1727204011.05607: getting the next task for host managed-node2 25675 1727204011.05613: done getting next task for host managed-node2 25675 1727204011.05618: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25675 1727204011.05621: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204011.05637: getting variables 25675 1727204011.05639: in VariableManager get_vars() 25675 1727204011.05789: Calling all_inventory to load vars for managed-node2 25675 1727204011.05792: Calling groups_inventory to load vars for managed-node2 25675 1727204011.05795: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204011.05802: done sending task result for task 028d2410-947f-41bd-b19d-000000000050 25675 1727204011.05805: WORKER PROCESS EXITING 25675 1727204011.05818: Calling all_plugins_play to load vars for managed-node2 25675 1727204011.05821: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204011.05825: Calling groups_plugins_play to load vars for managed-node2 25675 1727204011.07659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204011.10410: done with get_vars() 25675 1727204011.10444: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:53:31 -0400 (0:00:00.068) 0:00:30.557 ***** 25675 1727204011.10651: entering _queue_task() for managed-node2/ping 25675 1727204011.11424: worker is 1 (out of 1 available) 25675 1727204011.11439: exiting _queue_task() for managed-node2/ping 25675 1727204011.11450: done queuing things up, now waiting for results queue to drain 25675 1727204011.11452: waiting for pending results... 25675 1727204011.11904: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 25675 1727204011.12154: in run() - task 028d2410-947f-41bd-b19d-000000000051 25675 1727204011.12178: variable 'ansible_search_path' from source: unknown 25675 1727204011.12189: variable 'ansible_search_path' from source: unknown 25675 1727204011.12283: calling self._execute() 25675 1727204011.12398: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204011.12409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204011.12424: variable 'omit' from source: magic vars 25675 1727204011.13095: variable 'ansible_distribution_major_version' from source: facts 25675 1727204011.13114: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204011.13128: variable 'omit' from source: magic vars 25675 1727204011.13331: variable 'omit' from source: magic vars 25675 1727204011.13334: variable 'omit' from source: magic vars 25675 1727204011.13345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204011.13393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204011.13416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204011.13435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204011.13449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204011.13486: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204011.13496: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204011.13504: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node2' 25675 1727204011.13703: Set connection var ansible_shell_type to sh 25675 1727204011.13715: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204011.13725: Set connection var ansible_timeout to 10 25675 1727204011.13733: Set connection var ansible_pipelining to False 25675 1727204011.13742: Set connection var ansible_shell_executable to /bin/sh 25675 1727204011.13748: Set connection var ansible_connection to ssh 25675 1727204011.13782: variable 'ansible_shell_executable' from source: unknown 25675 1727204011.13791: variable 'ansible_connection' from source: unknown 25675 1727204011.13798: variable 'ansible_module_compression' from source: unknown 25675 1727204011.13881: variable 'ansible_shell_type' from source: unknown 25675 1727204011.13884: variable 'ansible_shell_executable' from source: unknown 25675 1727204011.13886: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204011.13888: variable 'ansible_pipelining' from source: unknown 25675 1727204011.13890: variable 'ansible_timeout' from source: unknown 25675 1727204011.13892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204011.14038: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204011.14180: variable 'omit' from source: magic vars 25675 1727204011.14183: starting attempt loop 25675 1727204011.14185: running the handler 25675 1727204011.14187: _low_level_execute_command(): starting 25675 1727204011.14189: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204011.14931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204011.14947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204011.14991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204011.15012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204011.15025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204011.15110: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.15129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.15245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.16966: stdout chunk (state=3): >>>/root <<< 25675 1727204011.17170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.17174: stdout 
chunk (state=3): >>><<< 25675 1727204011.17178: stderr chunk (state=3): >>><<< 25675 1727204011.17193: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204011.17215: _low_level_execute_command(): starting 25675 1727204011.17227: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829 `" && echo ansible-tmp-1727204011.1720068-28341-65151151695829="` echo /root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829 `" ) && sleep 0' 25675 1727204011.18796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204011.18801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204011.18804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204011.18806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204011.18809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204011.18822: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204011.18825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204011.18827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204011.18829: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204011.18832: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727204011.18834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204011.18836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204011.18838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204011.18902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204011.18906: stderr chunk (state=3): >>>debug2: match found <<< 25675 1727204011.19059: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.19070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.19493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.21368: stdout chunk (state=3): >>>ansible-tmp-1727204011.1720068-28341-65151151695829=/root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829 <<< 25675 1727204011.21472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.21569: stderr chunk (state=3): >>><<< 25675 1727204011.21572: stdout chunk (state=3): >>><<< 25675 1727204011.21596: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204011.1720068-28341-65151151695829=/root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204011.21646: variable 'ansible_module_compression' from source: unknown 25675 1727204011.21694: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 25675 1727204011.21732: variable 'ansible_facts' from source: unknown 25675 1727204011.22071: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/AnsiballZ_ping.py 25675 1727204011.22457: Sending initial data 25675 1727204011.22460: Sent initial data (152 bytes) 25675 1727204011.24025: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.24032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.24142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.25883: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204011.25980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204011.26021: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmprqoyvv2r /root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/AnsiballZ_ping.py <<< 25675 1727204011.26025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/AnsiballZ_ping.py" <<< 25675 1727204011.26095: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmprqoyvv2r" to remote "/root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/AnsiballZ_ping.py" <<< 25675 1727204011.28051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.28187: stderr chunk (state=3): >>><<< 25675 1727204011.28190: stdout chunk (state=3): >>><<< 25675 1727204011.28193: done transferring module to remote 25675 1727204011.28198: _low_level_execute_command(): starting 25675 1727204011.28208: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/ /root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/AnsiballZ_ping.py && sleep 0' 25675 1727204011.29814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204011.29934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204011.29960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.29988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.30156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.32080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.32085: stdout chunk (state=3): >>><<< 25675 1727204011.32087: stderr chunk (state=3): >>><<< 25675 1727204011.32204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204011.32208: _low_level_execute_command(): starting 25675 1727204011.32210: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/AnsiballZ_ping.py && sleep 0' 25675 1727204011.33354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204011.33357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204011.33381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204011.33391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204011.33884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.33887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.34007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.48899: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25675 1727204011.50250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204011.50256: stdout chunk (state=3): >>><<< 25675 1727204011.50259: stderr chunk (state=3): >>><<< 25675 1727204011.50303: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
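The exchange above is the standard AnsiballZ round-trip for the ping module: the controller uploads AnsiballZ_ping.py into a per-task temp directory over the multiplexed SSH connection, marks it executable, runs it with the remote /usr/bin/python3.12, and reads the JSON result ({"ping": "pong", ...}) from stdout. A minimal standalone sketch that drives the same module is shown below; it is an illustration only, since the actual "Re-test connectivity" task is defined inside the fedora.linux_system_roles.network role, and the hosts/gather_facts values here are assumptions made to keep the example self-contained.

# Hypothetical standalone sketch, not the role's real task file.
- hosts: managed-node2
  gather_facts: false            # assumption: facts are not needed for this sketch
  tasks:
    - name: Re-test connectivity
      ansible.builtin.ping:
        data: pong               # matches the module_args echoed in the log; "pong" is also the default
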
25675 1727204011.50328: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204011.50338: _low_level_execute_command(): starting 25675 1727204011.50343: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204011.1720068-28341-65151151695829/ > /dev/null 2>&1 && sleep 0' 25675 1727204011.51531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204011.51597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204011.51686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204011.51690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204011.51782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.51918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.52207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.54165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.54169: stdout chunk (state=3): >>><<< 25675 1727204011.54172: stderr chunk (state=3): >>><<< 25675 1727204011.54273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204011.54281: handler run complete 25675 1727204011.54284: attempt loop complete, returning result 25675 1727204011.54285: _execute() done 25675 1727204011.54287: dumping result to json 25675 1727204011.54289: done dumping result, returning 25675 1727204011.54290: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-41bd-b19d-000000000051] 25675 1727204011.54292: sending task result for task 028d2410-947f-41bd-b19d-000000000051 25675 1727204011.54351: done sending task result for task 028d2410-947f-41bd-b19d-000000000051 25675 1727204011.54353: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 25675 1727204011.54436: no more pending results, returning what we have 25675 1727204011.54440: results queue empty 25675 1727204011.54441: checking for any_errors_fatal 25675 1727204011.54448: done checking for any_errors_fatal 25675 1727204011.54449: checking for max_fail_percentage 25675 1727204011.54450: done checking for max_fail_percentage 25675 1727204011.54452: checking to see if all hosts have failed and the running result is not ok 25675 1727204011.54453: done checking to see if all hosts have failed 25675 1727204011.54453: getting the remaining hosts for this loop 25675 1727204011.54455: done getting the remaining hosts for this loop 25675 1727204011.54459: getting the next task for host managed-node2 25675 1727204011.54467: done getting next task for host managed-node2 25675 1727204011.54469: ^ task is: TASK: meta (role_complete) 25675 1727204011.54471: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204011.54487: getting variables 25675 1727204011.54490: in VariableManager get_vars() 25675 1727204011.54531: Calling all_inventory to load vars for managed-node2 25675 1727204011.54535: Calling groups_inventory to load vars for managed-node2 25675 1727204011.54537: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204011.54550: Calling all_plugins_play to load vars for managed-node2 25675 1727204011.54553: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204011.54556: Calling groups_plugins_play to load vars for managed-node2 25675 1727204011.56844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204011.58667: done with get_vars() 25675 1727204011.58700: done getting variables 25675 1727204011.58780: done queuing things up, now waiting for results queue to drain 25675 1727204011.58783: results queue empty 25675 1727204011.58783: checking for any_errors_fatal 25675 1727204011.58786: done checking for any_errors_fatal 25675 1727204011.58787: checking for max_fail_percentage 25675 1727204011.58788: done checking for max_fail_percentage 25675 1727204011.58789: checking to see if all hosts have failed and the running result is not ok 25675 1727204011.58790: done checking to see if all hosts have failed 25675 1727204011.58790: getting the remaining hosts for this loop 25675 1727204011.58791: done getting the remaining hosts for this loop 25675 1727204011.58794: getting the next task for host managed-node2 25675 1727204011.58797: done getting next task for host managed-node2 25675 1727204011.58799: ^ task is: TASK: meta (flush_handlers) 25675 1727204011.58800: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204011.58803: getting variables 25675 1727204011.58804: in VariableManager get_vars() 25675 1727204011.58815: Calling all_inventory to load vars for managed-node2 25675 1727204011.58817: Calling groups_inventory to load vars for managed-node2 25675 1727204011.58819: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204011.58823: Calling all_plugins_play to load vars for managed-node2 25675 1727204011.58825: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204011.58828: Calling groups_plugins_play to load vars for managed-node2 25675 1727204011.60487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204011.62112: done with get_vars() 25675 1727204011.62141: done getting variables 25675 1727204011.62208: in VariableManager get_vars() 25675 1727204011.62221: Calling all_inventory to load vars for managed-node2 25675 1727204011.62223: Calling groups_inventory to load vars for managed-node2 25675 1727204011.62225: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204011.62231: Calling all_plugins_play to load vars for managed-node2 25675 1727204011.62233: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204011.62236: Calling groups_plugins_play to load vars for managed-node2 25675 1727204011.63383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204011.64980: done with get_vars() 25675 1727204011.65009: done queuing things up, now waiting for results queue to drain 25675 1727204011.65012: results queue empty 25675 1727204011.65013: checking for any_errors_fatal 25675 1727204011.65014: done checking for any_errors_fatal 25675 1727204011.65015: checking for max_fail_percentage 25675 1727204011.65016: done checking for max_fail_percentage 25675 1727204011.65016: checking to see if all hosts have failed and the running result is not ok 25675 1727204011.65017: done checking to see if all hosts have failed 25675 1727204011.65018: getting the remaining hosts for this loop 25675 1727204011.65019: done getting the remaining hosts for this loop 25675 1727204011.65022: getting the next task for host managed-node2 25675 1727204011.65025: done getting next task for host managed-node2 25675 1727204011.65027: ^ task is: TASK: meta (flush_handlers) 25675 1727204011.65028: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204011.65032: getting variables 25675 1727204011.65033: in VariableManager get_vars() 25675 1727204011.65051: Calling all_inventory to load vars for managed-node2 25675 1727204011.65059: Calling groups_inventory to load vars for managed-node2 25675 1727204011.65066: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204011.65094: Calling all_plugins_play to load vars for managed-node2 25675 1727204011.65107: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204011.65111: Calling groups_plugins_play to load vars for managed-node2 25675 1727204011.66332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204011.67904: done with get_vars() 25675 1727204011.67929: done getting variables 25675 1727204011.67982: in VariableManager get_vars() 25675 1727204011.67996: Calling all_inventory to load vars for managed-node2 25675 1727204011.67998: Calling groups_inventory to load vars for managed-node2 25675 1727204011.68000: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204011.68005: Calling all_plugins_play to load vars for managed-node2 25675 1727204011.68007: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204011.68010: Calling groups_plugins_play to load vars for managed-node2 25675 1727204011.69141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204011.70951: done with get_vars() 25675 1727204011.70980: done queuing things up, now waiting for results queue to drain 25675 1727204011.70983: results queue empty 25675 1727204011.70983: checking for any_errors_fatal 25675 1727204011.70985: done checking for any_errors_fatal 25675 1727204011.70985: checking for max_fail_percentage 25675 1727204011.70986: done checking for max_fail_percentage 25675 1727204011.70987: checking to see if all hosts have failed and the running result is not ok 25675 1727204011.70988: done checking to see if all hosts have failed 25675 1727204011.70989: getting the remaining hosts for this loop 25675 1727204011.70990: done getting the remaining hosts for this loop 25675 1727204011.70992: getting the next task for host managed-node2 25675 1727204011.70995: done getting next task for host managed-node2 25675 1727204011.70996: ^ task is: None 25675 1727204011.70998: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204011.70999: done queuing things up, now waiting for results queue to drain 25675 1727204011.70999: results queue empty 25675 1727204011.71000: checking for any_errors_fatal 25675 1727204011.71001: done checking for any_errors_fatal 25675 1727204011.71001: checking for max_fail_percentage 25675 1727204011.71002: done checking for max_fail_percentage 25675 1727204011.71003: checking to see if all hosts have failed and the running result is not ok 25675 1727204011.71003: done checking to see if all hosts have failed 25675 1727204011.71004: getting the next task for host managed-node2 25675 1727204011.71006: done getting next task for host managed-node2 25675 1727204011.71007: ^ task is: None 25675 1727204011.71008: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204011.71147: in VariableManager get_vars() 25675 1727204011.71163: done with get_vars() 25675 1727204011.71169: in VariableManager get_vars() 25675 1727204011.71183: done with get_vars() 25675 1727204011.71189: variable 'omit' from source: magic vars 25675 1727204011.71225: in VariableManager get_vars() 25675 1727204011.71235: done with get_vars() 25675 1727204011.71255: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 25675 1727204011.71504: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727204011.71526: getting the remaining hosts for this loop 25675 1727204011.71527: done getting the remaining hosts for this loop 25675 1727204011.71530: getting the next task for host managed-node2 25675 1727204011.71536: done getting next task for host managed-node2 25675 1727204011.71538: ^ task is: TASK: Gathering Facts 25675 1727204011.71540: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204011.71542: getting variables 25675 1727204011.71543: in VariableManager get_vars() 25675 1727204011.71551: Calling all_inventory to load vars for managed-node2 25675 1727204011.71552: Calling groups_inventory to load vars for managed-node2 25675 1727204011.71554: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204011.71559: Calling all_plugins_play to load vars for managed-node2 25675 1727204011.71561: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204011.71563: Calling groups_plugins_play to load vars for managed-node2 25675 1727204011.72851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204011.74478: done with get_vars() 25675 1727204011.74507: done getting variables 25675 1727204011.74549: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:53:31 -0400 (0:00:00.639) 0:00:31.196 ***** 25675 1727204011.74572: entering _queue_task() for managed-node2/gather_facts 25675 1727204011.74939: worker is 1 (out of 1 available) 25675 1727204011.74951: exiting _queue_task() for managed-node2/gather_facts 25675 1727204011.74962: done queuing things up, now waiting for results queue to drain 25675 1727204011.74964: waiting for pending results... 
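At this point the previous play has drained and a new play, "Delete the interface", begins; its first task is the implicit fact-gathering step (the setup module run as "Gathering Facts") at playbooks/down_profile+delete_interface.yml:5. A hedged reconstruction of how such a play header is typically declared follows; the real play lives at the path quoted in the log and its task list is not reproduced here, so the hosts pattern and the empty task list below are placeholders.

# Hypothetical sketch of the play starting above; not the actual contents of
# down_profile+delete_interface.yml.
- name: Delete the interface
  hosts: all                     # assumption; the log only shows it running on managed-node2
  gather_facts: true             # yields the implicit "Gathering Facts" task seen in the log
  tasks: []                      # the play's real tasks are not shown in this excerpt
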
25675 1727204011.75265: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727204011.75331: in run() - task 028d2410-947f-41bd-b19d-0000000003f8 25675 1727204011.75367: variable 'ansible_search_path' from source: unknown 25675 1727204011.75424: calling self._execute() 25675 1727204011.75533: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204011.75545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204011.75558: variable 'omit' from source: magic vars 25675 1727204011.76381: variable 'ansible_distribution_major_version' from source: facts 25675 1727204011.76384: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204011.76387: variable 'omit' from source: magic vars 25675 1727204011.76389: variable 'omit' from source: magic vars 25675 1727204011.76392: variable 'omit' from source: magic vars 25675 1727204011.76394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204011.76396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204011.76418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204011.76439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204011.76457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204011.76494: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204011.76503: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204011.76511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204011.76614: Set connection var ansible_shell_type to sh 25675 1727204011.76626: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204011.76635: Set connection var ansible_timeout to 10 25675 1727204011.76645: Set connection var ansible_pipelining to False 25675 1727204011.76656: Set connection var ansible_shell_executable to /bin/sh 25675 1727204011.76663: Set connection var ansible_connection to ssh 25675 1727204011.76698: variable 'ansible_shell_executable' from source: unknown 25675 1727204011.76705: variable 'ansible_connection' from source: unknown 25675 1727204011.76711: variable 'ansible_module_compression' from source: unknown 25675 1727204011.76716: variable 'ansible_shell_type' from source: unknown 25675 1727204011.76721: variable 'ansible_shell_executable' from source: unknown 25675 1727204011.76725: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204011.76731: variable 'ansible_pipelining' from source: unknown 25675 1727204011.76736: variable 'ansible_timeout' from source: unknown 25675 1727204011.76742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204011.77029: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204011.77039: variable 'omit' from source: magic vars 25675 1727204011.77044: starting attempt loop 25675 1727204011.77047: running the 
handler 25675 1727204011.77067: variable 'ansible_facts' from source: unknown 25675 1727204011.77098: _low_level_execute_command(): starting 25675 1727204011.77115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204011.77901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204011.77919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.77944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.78055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.79756: stdout chunk (state=3): >>>/root <<< 25675 1727204011.79893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.79905: stdout chunk (state=3): >>><<< 25675 1727204011.79921: stderr chunk (state=3): >>><<< 25675 1727204011.79962: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204011.80004: _low_level_execute_command(): starting 25675 1727204011.80182: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561 `" && echo ansible-tmp-1727204011.799906-28428-205671870160561="` echo /root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561 `" ) && sleep 0' 
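The command just issued creates the per-task working directory on the managed node: umask 77, then mkdir -p the remote_tmp location (~/.ansible/tmp, as echoed in the module args earlier) and a private ansible-tmp-<timestamp>-<pid>-<random> directory inside it, into which the AnsiballZ payload is uploaded and from which it is later removed. Both that location and the interpreter used to run the payload are per-host settings; a small host_vars sketch with example values follows (the remote_tmp path below is illustrative, only the interpreter value is taken from this run).

# Hypothetical host_vars sketch for managed-node2; values are examples only.
ansible_remote_tmp: /var/tmp/ansible-tmp          # overrides the shell plugin's remote_tmp (~/.ansible/tmp by default)
ansible_python_interpreter: /usr/bin/python3.12   # interpreter the log shows executing the AnsiballZ payloads
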
25675 1727204011.80873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204011.80879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204011.80882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204011.80892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204011.80936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.80961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.81073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.82996: stdout chunk (state=3): >>>ansible-tmp-1727204011.799906-28428-205671870160561=/root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561 <<< 25675 1727204011.83143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.83165: stdout chunk (state=3): >>><<< 25675 1727204011.83168: stderr chunk (state=3): >>><<< 25675 1727204011.83384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204011.799906-28428-205671870160561=/root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204011.83388: variable 'ansible_module_compression' from source: unknown 25675 1727204011.83391: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727204011.83393: variable 
'ansible_facts' from source: unknown 25675 1727204011.83566: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/AnsiballZ_setup.py 25675 1727204011.83839: Sending initial data 25675 1727204011.83848: Sent initial data (153 bytes) 25675 1727204011.84685: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204011.84696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204011.84789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.84813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.84924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.86606: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204011.86674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204011.86759: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp5d2ar3j_ /root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/AnsiballZ_setup.py <<< 25675 1727204011.86762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/AnsiballZ_setup.py" <<< 25675 1727204011.86827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp5d2ar3j_" to remote "/root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/AnsiballZ_setup.py" <<< 25675 1727204011.89674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.89746: stderr chunk (state=3): >>><<< 25675 1727204011.89756: stdout chunk (state=3): >>><<< 25675 1727204011.89784: done transferring module to remote 25675 1727204011.89787: _low_level_execute_command(): starting 25675 1727204011.89794: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/ /root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/AnsiballZ_setup.py && sleep 0' 25675 1727204011.91129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204011.91364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204011.91696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.91701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204011.93426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204011.93461: stderr chunk (state=3): >>><<< 25675 1727204011.93610: stdout chunk (state=3): >>><<< 25675 1727204011.93688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204011.93692: _low_level_execute_command(): starting 25675 1727204011.93695: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/AnsiballZ_setup.py && sleep 0' 25675 1727204011.94603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204011.94736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204011.94763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204011.94796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204011.94831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204011.94991: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204011.95040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204011.95071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204011.95095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204011.95213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204012.61718: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": 
"UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2913, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 618, "free": 2913}, "nocache": {"free": 3270, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 598, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785743360, "block_size": 4096, "block_total": 65519099, "block_available": 63912535, "block_used": 1606564, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_<<< 25675 1727204012.61742: stdout chunk (state=3): >>>user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.49462890625, 
"5m": 0.431640625, "15m": 0.23046875}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "32", "epoch": "1727204012", "epoch_int": "1727204012", "date": "2024-09-24", "time": "14:53:32", "iso8601_micro": "2024-09-24T18:53:32.551489Z", "iso8601": "2024-09-24T18:53:32Z", "iso8601_basic": "20240924T145332551489", "iso8601_basic_short": "20240924T145332", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["peerlsr27", "lsr27", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": 
"128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "f2:f7:d0:d6:86:b9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f0f7:d0ff:fed6:86b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", 
"tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "12:fe:e3:2a:a8:0d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::10fe:e3ff:fe2a:a80d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::f0f7:d0ff:fed6:86b9", "fe80::10fe:e3ff:fe2a:a80d", "fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d", "fe80::10fe:e3ff:fe2a:a80d", "fe80::f0f7:d0ff:fed6:86b9"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727204012.63519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204012.63558: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. <<< 25675 1727204012.63592: stderr chunk (state=3): >>><<< 25675 1727204012.63704: stdout chunk (state=3): >>><<< 25675 1727204012.63746: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2913, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 618, "free": 2913}, "nocache": {"free": 3270, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 598, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785743360, "block_size": 4096, "block_total": 65519099, "block_available": 63912535, "block_used": 1606564, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.49462890625, "5m": 0.431640625, "15m": 0.23046875}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "32", "epoch": "1727204012", "epoch_int": "1727204012", "date": "2024-09-24", "time": "14:53:32", "iso8601_micro": "2024-09-24T18:53:32.551489Z", "iso8601": "2024-09-24T18:53:32Z", "iso8601_basic": "20240924T145332551489", "iso8601_basic_short": "20240924T145332", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": 
"cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["peerlsr27", "lsr27", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "f2:f7:d0:d6:86:b9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f0f7:d0ff:fed6:86b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "12:fe:e3:2a:a8:0d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::10fe:e3ff:fe2a:a80d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::f0f7:d0ff:fed6:86b9", "fe80::10fe:e3ff:fe2a:a80d", "fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d", "fe80::10fe:e3ff:fe2a:a80d", "fe80::f0f7:d0ff:fed6:86b9"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727204012.64704: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204012.64742: _low_level_execute_command(): starting 25675 1727204012.64754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204011.799906-28428-205671870160561/ > /dev/null 2>&1 && sleep 0' 25675 1727204012.66021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204012.66090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204012.66250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204012.66267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204012.66325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204012.68238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204012.68259: stderr chunk (state=3): >>><<< 25675 1727204012.68266: stdout chunk (state=3): >>><<< 25675 1727204012.68299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204012.68333: handler run complete 25675 1727204012.68478: variable 'ansible_facts' from source: unknown 25675 1727204012.68649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204012.68959: variable 'ansible_facts' from source: unknown 25675 1727204012.69060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204012.69232: attempt loop complete, returning result 25675 1727204012.69242: _execute() done 25675 1727204012.69249: dumping result to json 25675 1727204012.69296: done dumping result, returning 25675 1727204012.69312: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-0000000003f8] 25675 1727204012.69323: sending task result for task 028d2410-947f-41bd-b19d-0000000003f8 25675 1727204012.70446: done sending task result for task 028d2410-947f-41bd-b19d-0000000003f8 25675 1727204012.70449: WORKER PROCESS EXITING ok: [managed-node2] 25675 1727204012.70824: no more pending results, returning what we have 25675 1727204012.70827: results queue empty 25675 1727204012.70828: checking for any_errors_fatal 25675 1727204012.70829: done checking for any_errors_fatal 25675 1727204012.70829: checking for max_fail_percentage 25675 1727204012.70831: done checking for max_fail_percentage 25675 1727204012.70832: checking to see if all hosts have failed and the running result is not ok 25675 1727204012.70833: done checking to see if all hosts have failed 25675 1727204012.70833: getting the remaining hosts for this loop 25675 1727204012.70834: done getting the remaining hosts for this loop 25675 1727204012.70837: getting the next task for host managed-node2 25675 1727204012.70842: done getting next task for host managed-node2 25675 1727204012.70844: ^ task is: TASK: meta (flush_handlers) 25675 1727204012.70846: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204012.70849: getting variables 25675 1727204012.70850: in VariableManager get_vars() 25675 1727204012.70871: Calling all_inventory to load vars for managed-node2 25675 1727204012.70873: Calling groups_inventory to load vars for managed-node2 25675 1727204012.71132: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204012.71145: Calling all_plugins_play to load vars for managed-node2 25675 1727204012.71148: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204012.71152: Calling groups_plugins_play to load vars for managed-node2 25675 1727204012.73065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204012.74741: done with get_vars() 25675 1727204012.74767: done getting variables 25675 1727204012.74839: in VariableManager get_vars() 25675 1727204012.74851: Calling all_inventory to load vars for managed-node2 25675 1727204012.74853: Calling groups_inventory to load vars for managed-node2 25675 1727204012.74855: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204012.74861: Calling all_plugins_play to load vars for managed-node2 25675 1727204012.74863: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204012.74866: Calling groups_plugins_play to load vars for managed-node2 25675 1727204012.76729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204012.78589: done with get_vars() 25675 1727204012.78618: done queuing things up, now waiting for results queue to drain 25675 1727204012.78621: results queue empty 25675 1727204012.78622: checking for any_errors_fatal 25675 1727204012.78626: done checking for any_errors_fatal 25675 1727204012.78627: checking for max_fail_percentage 25675 1727204012.78629: done checking for max_fail_percentage 25675 1727204012.78629: checking to see if all hosts have failed and the running result is not ok 25675 1727204012.78630: done checking to see if all hosts have failed 25675 1727204012.78635: getting the remaining hosts for this loop 25675 1727204012.78637: done getting the remaining hosts for this loop 25675 1727204012.78640: getting the next task for host managed-node2 25675 1727204012.78644: done getting next task for host managed-node2 25675 1727204012.78646: ^ task is: TASK: Include the task 'delete_interface.yml' 25675 1727204012.78648: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204012.78651: getting variables 25675 1727204012.78652: in VariableManager get_vars() 25675 1727204012.78662: Calling all_inventory to load vars for managed-node2 25675 1727204012.78665: Calling groups_inventory to load vars for managed-node2 25675 1727204012.78667: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204012.78673: Calling all_plugins_play to load vars for managed-node2 25675 1727204012.78677: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204012.78681: Calling groups_plugins_play to load vars for managed-node2 25675 1727204012.84226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204012.85742: done with get_vars() 25675 1727204012.85767: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:53:32 -0400 (0:00:01.112) 0:00:32.309 ***** 25675 1727204012.85840: entering _queue_task() for managed-node2/include_tasks 25675 1727204012.86251: worker is 1 (out of 1 available) 25675 1727204012.86265: exiting _queue_task() for managed-node2/include_tasks 25675 1727204012.86283: done queuing things up, now waiting for results queue to drain 25675 1727204012.86285: waiting for pending results... 25675 1727204012.86526: running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' 25675 1727204012.86688: in run() - task 028d2410-947f-41bd-b19d-000000000054 25675 1727204012.86711: variable 'ansible_search_path' from source: unknown 25675 1727204012.86773: calling self._execute() 25675 1727204012.86899: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204012.86920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204012.86943: variable 'omit' from source: magic vars 25675 1727204012.87608: variable 'ansible_distribution_major_version' from source: facts 25675 1727204012.87625: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204012.87642: _execute() done 25675 1727204012.87654: dumping result to json 25675 1727204012.87657: done dumping result, returning 25675 1727204012.87663: done running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' [028d2410-947f-41bd-b19d-000000000054] 25675 1727204012.87673: sending task result for task 028d2410-947f-41bd-b19d-000000000054 25675 1727204012.87818: no more pending results, returning what we have 25675 1727204012.87824: in VariableManager get_vars() 25675 1727204012.87859: Calling all_inventory to load vars for managed-node2 25675 1727204012.87863: Calling groups_inventory to load vars for managed-node2 25675 1727204012.87867: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204012.87884: Calling all_plugins_play to load vars for managed-node2 25675 1727204012.87888: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204012.87891: Calling groups_plugins_play to load vars for managed-node2 25675 1727204012.88790: done sending task result for task 028d2410-947f-41bd-b19d-000000000054 25675 1727204012.88794: WORKER PROCESS EXITING 25675 1727204012.89555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204012.91249: done with get_vars() 25675 
1727204012.91268: variable 'ansible_search_path' from source: unknown 25675 1727204012.91286: we have included files to process 25675 1727204012.91287: generating all_blocks data 25675 1727204012.91288: done generating all_blocks data 25675 1727204012.91289: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 25675 1727204012.91290: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 25675 1727204012.91292: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 25675 1727204012.91506: done processing included file 25675 1727204012.91508: iterating over new_blocks loaded from include file 25675 1727204012.91510: in VariableManager get_vars() 25675 1727204012.91520: done with get_vars() 25675 1727204012.91523: filtering new block on tags 25675 1727204012.91537: done filtering new block on tags 25675 1727204012.91540: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 25675 1727204012.91545: extending task lists for all hosts with included blocks 25675 1727204012.91578: done extending task lists 25675 1727204012.91579: done processing included files 25675 1727204012.91580: results queue empty 25675 1727204012.91581: checking for any_errors_fatal 25675 1727204012.91583: done checking for any_errors_fatal 25675 1727204012.91584: checking for max_fail_percentage 25675 1727204012.91585: done checking for max_fail_percentage 25675 1727204012.91586: checking to see if all hosts have failed and the running result is not ok 25675 1727204012.91587: done checking to see if all hosts have failed 25675 1727204012.91588: getting the remaining hosts for this loop 25675 1727204012.91589: done getting the remaining hosts for this loop 25675 1727204012.91592: getting the next task for host managed-node2 25675 1727204012.91596: done getting next task for host managed-node2 25675 1727204012.91598: ^ task is: TASK: Remove test interface if necessary 25675 1727204012.91600: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204012.91602: getting variables 25675 1727204012.91603: in VariableManager get_vars() 25675 1727204012.91612: Calling all_inventory to load vars for managed-node2 25675 1727204012.91615: Calling groups_inventory to load vars for managed-node2 25675 1727204012.91617: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204012.91623: Calling all_plugins_play to load vars for managed-node2 25675 1727204012.91626: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204012.91628: Calling groups_plugins_play to load vars for managed-node2 25675 1727204012.92848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204012.96037: done with get_vars() 25675 1727204012.96070: done getting variables 25675 1727204012.96189: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:53:32 -0400 (0:00:00.103) 0:00:32.413 ***** 25675 1727204012.96224: entering _queue_task() for managed-node2/command 25675 1727204012.96591: worker is 1 (out of 1 available) 25675 1727204012.96603: exiting _queue_task() for managed-node2/command 25675 1727204012.96615: done queuing things up, now waiting for results queue to drain 25675 1727204012.96617: waiting for pending results... 25675 1727204012.96899: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 25675 1727204012.97042: in run() - task 028d2410-947f-41bd-b19d-000000000409 25675 1727204012.97064: variable 'ansible_search_path' from source: unknown 25675 1727204012.97074: variable 'ansible_search_path' from source: unknown 25675 1727204012.97123: calling self._execute() 25675 1727204012.97224: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204012.97348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204012.97352: variable 'omit' from source: magic vars 25675 1727204012.97653: variable 'ansible_distribution_major_version' from source: facts 25675 1727204012.97673: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204012.97690: variable 'omit' from source: magic vars 25675 1727204012.97734: variable 'omit' from source: magic vars 25675 1727204012.97842: variable 'interface' from source: set_fact 25675 1727204012.97867: variable 'omit' from source: magic vars 25675 1727204012.97923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204012.97960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204012.97987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204012.98015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204012.98115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 
1727204012.98119: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204012.98121: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204012.98123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204012.98183: Set connection var ansible_shell_type to sh 25675 1727204012.98193: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204012.98201: Set connection var ansible_timeout to 10 25675 1727204012.98209: Set connection var ansible_pipelining to False 25675 1727204012.98219: Set connection var ansible_shell_executable to /bin/sh 25675 1727204012.98227: Set connection var ansible_connection to ssh 25675 1727204012.98334: variable 'ansible_shell_executable' from source: unknown 25675 1727204012.98338: variable 'ansible_connection' from source: unknown 25675 1727204012.98340: variable 'ansible_module_compression' from source: unknown 25675 1727204012.98342: variable 'ansible_shell_type' from source: unknown 25675 1727204012.98344: variable 'ansible_shell_executable' from source: unknown 25675 1727204012.98346: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204012.98348: variable 'ansible_pipelining' from source: unknown 25675 1727204012.98350: variable 'ansible_timeout' from source: unknown 25675 1727204012.98352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204012.98458: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204012.98472: variable 'omit' from source: magic vars 25675 1727204012.98486: starting attempt loop 25675 1727204012.98494: running the handler 25675 1727204012.98516: _low_level_execute_command(): starting 25675 1727204012.98531: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204012.99521: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204012.99593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204012.99630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204012.99673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204012.99679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204012.99855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 25675 1727204013.01563: stdout chunk (state=3): >>>/root <<< 25675 1727204013.01783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.01787: stdout chunk (state=3): >>><<< 25675 1727204013.01790: stderr chunk (state=3): >>><<< 25675 1727204013.01814: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204013.01884: _low_level_execute_command(): starting 25675 1727204013.01888: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993 `" && echo ansible-tmp-1727204013.0182478-28484-61449435501993="` echo /root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993 `" ) && sleep 0' 25675 1727204013.02488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204013.02504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204013.02519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204013.02542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204013.02560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204013.02585: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204013.02657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204013.02695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204013.02712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.02726: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 25675 1727204013.02838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.04801: stdout chunk (state=3): >>>ansible-tmp-1727204013.0182478-28484-61449435501993=/root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993 <<< 25675 1727204013.05051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.05079: stdout chunk (state=3): >>><<< 25675 1727204013.05084: stderr chunk (state=3): >>><<< 25675 1727204013.05196: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204013.0182478-28484-61449435501993=/root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204013.05201: variable 'ansible_module_compression' from source: unknown 25675 1727204013.05210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727204013.05255: variable 'ansible_facts' from source: unknown 25675 1727204013.05342: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/AnsiballZ_command.py 25675 1727204013.05581: Sending initial data 25675 1727204013.05585: Sent initial data (155 bytes) 25675 1727204013.06179: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 
1727204013.06209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.06311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.07889: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25675 1727204013.07910: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204013.08002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204013.08089: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp3_rz86hz /root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/AnsiballZ_command.py <<< 25675 1727204013.08092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/AnsiballZ_command.py" <<< 25675 1727204013.08156: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp3_rz86hz" to remote "/root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/AnsiballZ_command.py" <<< 25675 1727204013.09499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.09502: stdout chunk (state=3): >>><<< 25675 1727204013.09504: stderr chunk (state=3): >>><<< 25675 1727204013.09506: done transferring module to remote 25675 1727204013.09508: _low_level_execute_command(): starting 25675 1727204013.09510: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/ /root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/AnsiballZ_command.py && sleep 0' 25675 1727204013.10072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204013.10092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204013.10184: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204013.10209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204013.10233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.10277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.10338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.12156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.12195: stdout chunk (state=3): >>><<< 25675 1727204013.12199: stderr chunk (state=3): >>><<< 25675 1727204013.12293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204013.12299: _low_level_execute_command(): starting 25675 1727204013.12302: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/AnsiballZ_command.py && sleep 0' 25675 1727204013.13067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204013.13250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.13265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.13377: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.30023: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-24 14:53:33.283634", "end": "2024-09-24 14:53:33.294786", "delta": "0:00:00.011152", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727204013.32247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204013.32262: stdout chunk (state=3): >>><<< 25675 1727204013.32279: stderr chunk (state=3): >>><<< 25675 1727204013.32307: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-24 14:53:33.283634", "end": "2024-09-24 14:53:33.294786", "delta": "0:00:00.011152", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
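For reference, the module invocation traced above (ansible.legacy.command with _raw_params "ip link del lsr27" and _uses_shell false, delivered by uploading AnsiballZ_command.py over sftp, chmod'ing it, and running it with python3.12, which is the standard delivery path when pipelining is disabled) corresponds to a task roughly like the sketch below. This is a reconstruction, not the actual task from the test playbook: the real task most likely templates the interface name instead of hard-coding lsr27, and the changed_when line is an assumption added only to match the "changed": false shown in the reported result further down.

    # Hypothetical sketch -- the real task lives in the fedora.linux_system_roles
    # network test playbooks and may differ in naming and conditions.
    - name: Remove test interface if necessary
      command: ip link del lsr27      # resolved to ansible.legacy.command in the trace above
      changed_when: false             # assumption: matches the "changed": false in the reported result
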
25675 1727204013.32429: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204013.32432: _low_level_execute_command(): starting 25675 1727204013.32435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204013.0182478-28484-61449435501993/ > /dev/null 2>&1 && sleep 0' 25675 1727204013.32988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204013.33005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204013.33021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204013.33093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204013.33152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204013.33171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.33238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.33324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.35229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.35233: stdout chunk (state=3): >>><<< 25675 1727204013.35236: stderr chunk (state=3): >>><<< 25675 1727204013.35381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204013.35385: handler run complete 25675 1727204013.35388: Evaluated conditional (False): False 25675 1727204013.35390: attempt loop complete, returning result 25675 1727204013.35392: _execute() done 25675 1727204013.35394: dumping result to json 25675 1727204013.35396: done dumping result, returning 25675 1727204013.35398: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [028d2410-947f-41bd-b19d-000000000409] 25675 1727204013.35400: sending task result for task 028d2410-947f-41bd-b19d-000000000409 25675 1727204013.35474: done sending task result for task 028d2410-947f-41bd-b19d-000000000409 ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.011152", "end": "2024-09-24 14:53:33.294786", "rc": 0, "start": "2024-09-24 14:53:33.283634" } 25675 1727204013.35559: no more pending results, returning what we have 25675 1727204013.35563: results queue empty 25675 1727204013.35564: checking for any_errors_fatal 25675 1727204013.35565: done checking for any_errors_fatal 25675 1727204013.35566: checking for max_fail_percentage 25675 1727204013.35568: done checking for max_fail_percentage 25675 1727204013.35569: checking to see if all hosts have failed and the running result is not ok 25675 1727204013.35570: done checking to see if all hosts have failed 25675 1727204013.35570: getting the remaining hosts for this loop 25675 1727204013.35572: done getting the remaining hosts for this loop 25675 1727204013.35579: getting the next task for host managed-node2 25675 1727204013.35591: done getting next task for host managed-node2 25675 1727204013.35593: ^ task is: TASK: meta (flush_handlers) 25675 1727204013.35595: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204013.35601: getting variables 25675 1727204013.35603: in VariableManager get_vars() 25675 1727204013.35636: Calling all_inventory to load vars for managed-node2 25675 1727204013.35639: Calling groups_inventory to load vars for managed-node2 25675 1727204013.35643: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204013.35654: Calling all_plugins_play to load vars for managed-node2 25675 1727204013.35658: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204013.35661: Calling groups_plugins_play to load vars for managed-node2 25675 1727204013.36230: WORKER PROCESS EXITING 25675 1727204013.37548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204013.39536: done with get_vars() 25675 1727204013.39571: done getting variables 25675 1727204013.39651: in VariableManager get_vars() 25675 1727204013.39663: Calling all_inventory to load vars for managed-node2 25675 1727204013.39666: Calling groups_inventory to load vars for managed-node2 25675 1727204013.39668: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204013.39673: Calling all_plugins_play to load vars for managed-node2 25675 1727204013.39678: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204013.39681: Calling groups_plugins_play to load vars for managed-node2 25675 1727204013.40892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204013.42486: done with get_vars() 25675 1727204013.42514: done queuing things up, now waiting for results queue to drain 25675 1727204013.42516: results queue empty 25675 1727204013.42517: checking for any_errors_fatal 25675 1727204013.42520: done checking for any_errors_fatal 25675 1727204013.42521: checking for max_fail_percentage 25675 1727204013.42522: done checking for max_fail_percentage 25675 1727204013.42523: checking to see if all hosts have failed and the running result is not ok 25675 1727204013.42524: done checking to see if all hosts have failed 25675 1727204013.42525: getting the remaining hosts for this loop 25675 1727204013.42526: done getting the remaining hosts for this loop 25675 1727204013.42528: getting the next task for host managed-node2 25675 1727204013.42532: done getting next task for host managed-node2 25675 1727204013.42534: ^ task is: TASK: meta (flush_handlers) 25675 1727204013.42535: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204013.42538: getting variables 25675 1727204013.42539: in VariableManager get_vars() 25675 1727204013.42548: Calling all_inventory to load vars for managed-node2 25675 1727204013.42550: Calling groups_inventory to load vars for managed-node2 25675 1727204013.42552: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204013.42558: Calling all_plugins_play to load vars for managed-node2 25675 1727204013.42560: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204013.42563: Calling groups_plugins_play to load vars for managed-node2 25675 1727204013.43772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204013.45319: done with get_vars() 25675 1727204013.45339: done getting variables 25675 1727204013.45386: in VariableManager get_vars() 25675 1727204013.45401: Calling all_inventory to load vars for managed-node2 25675 1727204013.45403: Calling groups_inventory to load vars for managed-node2 25675 1727204013.45406: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204013.45411: Calling all_plugins_play to load vars for managed-node2 25675 1727204013.45413: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204013.45416: Calling groups_plugins_play to load vars for managed-node2 25675 1727204013.46612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204013.48347: done with get_vars() 25675 1727204013.48372: done queuing things up, now waiting for results queue to drain 25675 1727204013.48375: results queue empty 25675 1727204013.48377: checking for any_errors_fatal 25675 1727204013.48380: done checking for any_errors_fatal 25675 1727204013.48381: checking for max_fail_percentage 25675 1727204013.48382: done checking for max_fail_percentage 25675 1727204013.48382: checking to see if all hosts have failed and the running result is not ok 25675 1727204013.48383: done checking to see if all hosts have failed 25675 1727204013.48384: getting the remaining hosts for this loop 25675 1727204013.48385: done getting the remaining hosts for this loop 25675 1727204013.48387: getting the next task for host managed-node2 25675 1727204013.48390: done getting next task for host managed-node2 25675 1727204013.48391: ^ task is: None 25675 1727204013.48393: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204013.48394: done queuing things up, now waiting for results queue to drain 25675 1727204013.48395: results queue empty 25675 1727204013.48395: checking for any_errors_fatal 25675 1727204013.48396: done checking for any_errors_fatal 25675 1727204013.48396: checking for max_fail_percentage 25675 1727204013.48397: done checking for max_fail_percentage 25675 1727204013.48398: checking to see if all hosts have failed and the running result is not ok 25675 1727204013.48399: done checking to see if all hosts have failed 25675 1727204013.48400: getting the next task for host managed-node2 25675 1727204013.48402: done getting next task for host managed-node2 25675 1727204013.48402: ^ task is: None 25675 1727204013.48404: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204013.48449: in VariableManager get_vars() 25675 1727204013.48469: done with get_vars() 25675 1727204013.48474: in VariableManager get_vars() 25675 1727204013.48488: done with get_vars() 25675 1727204013.48492: variable 'omit' from source: magic vars 25675 1727204013.48609: variable 'profile' from source: play vars 25675 1727204013.48715: in VariableManager get_vars() 25675 1727204013.48729: done with get_vars() 25675 1727204013.48749: variable 'omit' from source: magic vars 25675 1727204013.48818: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 25675 1727204013.49499: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727204013.49553: getting the remaining hosts for this loop 25675 1727204013.49555: done getting the remaining hosts for this loop 25675 1727204013.49557: getting the next task for host managed-node2 25675 1727204013.49560: done getting next task for host managed-node2 25675 1727204013.49562: ^ task is: TASK: Gathering Facts 25675 1727204013.49563: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204013.49565: getting variables 25675 1727204013.49566: in VariableManager get_vars() 25675 1727204013.49581: Calling all_inventory to load vars for managed-node2 25675 1727204013.49583: Calling groups_inventory to load vars for managed-node2 25675 1727204013.49586: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204013.49591: Calling all_plugins_play to load vars for managed-node2 25675 1727204013.49597: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204013.49601: Calling groups_plugins_play to load vars for managed-node2 25675 1727204013.50791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204013.52389: done with get_vars() 25675 1727204013.52421: done getting variables 25675 1727204013.52465: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 14:53:33 -0400 (0:00:00.562) 0:00:32.976 ***** 25675 1727204013.52491: entering _queue_task() for managed-node2/gather_facts 25675 1727204013.52859: worker is 1 (out of 1 available) 25675 1727204013.52872: exiting _queue_task() for managed-node2/gather_facts 25675 1727204013.52887: done queuing things up, now waiting for results queue to drain 25675 1727204013.52888: waiting for pending results... 
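The lines above close out the previous play and start PLAY [Remove {{ profile }}], whose implicit fact-gathering task is reported at remove_profile.yml:3 in the fedora.linux_system_roles test collection. A minimal sketch of a play header that would produce this sequence follows; the hosts value and the empty task list are assumptions, since this excerpt only shows the play running against managed-node2 and the removal tasks themselves are not visible here.

    # Hypothetical sketch of the play that starts here -- the actual
    # remove_profile.yml in the test collection may differ.
    - name: Remove {{ profile }}
      hosts: managed-node2      # assumption: the excerpt only shows managed-node2
      gather_facts: true        # produces the "Gathering Facts" task reported at remove_profile.yml:3
      tasks: []                 # placeholder; the profile-removal tasks are not visible in this excerpt

Note that the log just below shows the gather_facts task itself being guarded by the conditional ansible_distribution_major_version != '6'; that guard is not represented in the sketch.
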
25675 1727204013.53197: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727204013.53245: in run() - task 028d2410-947f-41bd-b19d-000000000417 25675 1727204013.53269: variable 'ansible_search_path' from source: unknown 25675 1727204013.53320: calling self._execute() 25675 1727204013.53414: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204013.53425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204013.53436: variable 'omit' from source: magic vars 25675 1727204013.53835: variable 'ansible_distribution_major_version' from source: facts 25675 1727204013.53839: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204013.53841: variable 'omit' from source: magic vars 25675 1727204013.53853: variable 'omit' from source: magic vars 25675 1727204013.53895: variable 'omit' from source: magic vars 25675 1727204013.53935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204013.53979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204013.54006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204013.54053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204013.54056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204013.54087: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204013.54096: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204013.54104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204013.54272: Set connection var ansible_shell_type to sh 25675 1727204013.54279: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204013.54283: Set connection var ansible_timeout to 10 25675 1727204013.54285: Set connection var ansible_pipelining to False 25675 1727204013.54288: Set connection var ansible_shell_executable to /bin/sh 25675 1727204013.54291: Set connection var ansible_connection to ssh 25675 1727204013.54293: variable 'ansible_shell_executable' from source: unknown 25675 1727204013.54296: variable 'ansible_connection' from source: unknown 25675 1727204013.54298: variable 'ansible_module_compression' from source: unknown 25675 1727204013.54307: variable 'ansible_shell_type' from source: unknown 25675 1727204013.54314: variable 'ansible_shell_executable' from source: unknown 25675 1727204013.54322: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204013.54330: variable 'ansible_pipelining' from source: unknown 25675 1727204013.54337: variable 'ansible_timeout' from source: unknown 25675 1727204013.54344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204013.54600: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204013.54603: variable 'omit' from source: magic vars 25675 1727204013.54606: starting attempt loop 25675 1727204013.54608: running the 
handler 25675 1727204013.54611: variable 'ansible_facts' from source: unknown 25675 1727204013.54612: _low_level_execute_command(): starting 25675 1727204013.54620: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204013.55622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.55653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.55755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.57501: stdout chunk (state=3): >>>/root <<< 25675 1727204013.57640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.57652: stdout chunk (state=3): >>><<< 25675 1727204013.57671: stderr chunk (state=3): >>><<< 25675 1727204013.57801: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204013.57805: _low_level_execute_command(): starting 25675 1727204013.57808: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098 `" && echo ansible-tmp-1727204013.5770564-28509-228336121077098="` echo /root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098 `" ) && sleep 0' 25675 1727204013.58359: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204013.58370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204013.58431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204013.58505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.58532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.58641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.60553: stdout chunk (state=3): >>>ansible-tmp-1727204013.5770564-28509-228336121077098=/root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098 <<< 25675 1727204013.60737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.60741: stdout chunk (state=3): >>><<< 25675 1727204013.60743: stderr chunk (state=3): >>><<< 25675 1727204013.60978: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204013.5770564-28509-228336121077098=/root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204013.60984: variable 'ansible_module_compression' from source: unknown 25675 1727204013.60986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727204013.60988: variable 'ansible_facts' from source: unknown 25675 1727204013.61165: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/AnsiballZ_setup.py 25675 1727204013.61421: Sending initial data 25675 1727204013.61431: Sent initial data (154 bytes) 25675 1727204013.61958: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204013.61974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204013.61995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204013.62019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204013.62118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.62145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.62242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.63881: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204013.63940: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204013.64037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp999nd_nn /root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/AnsiballZ_setup.py <<< 25675 1727204013.64040: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/AnsiballZ_setup.py" <<< 25675 1727204013.64107: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp999nd_nn" to remote "/root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/AnsiballZ_setup.py" <<< 25675 1727204013.65993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.65997: stdout chunk (state=3): >>><<< 25675 1727204013.66000: stderr chunk (state=3): >>><<< 25675 1727204013.66002: done transferring module to remote 25675 1727204013.66004: _low_level_execute_command(): starting 25675 1727204013.66006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/ /root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/AnsiballZ_setup.py && sleep 0' 25675 1727204013.66628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204013.66642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204013.66653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204013.66668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204013.66689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204013.66743: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204013.66797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204013.66814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.66847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.66955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204013.68901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204013.68905: stdout chunk (state=3): >>><<< 25675 1727204013.68907: stderr chunk (state=3): >>><<< 25675 1727204013.68926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204013.68935: _low_level_execute_command(): starting 25675 1727204013.68945: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/AnsiballZ_setup.py && sleep 0' 25675 1727204013.69603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204013.69620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204013.69636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204013.69654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204013.69671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204013.69690: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204013.69704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204013.69723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204013.69737: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204013.69749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727204013.69797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204013.69851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204013.69870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204013.69901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204013.70017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204014.32914: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2923, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 608, "free": 2923}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 600, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785714688, "block_size": 4096, "block_total": 65519099, "block_available": 63912528, "block_used": 1606571, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "34", "epoch": "1727204014", "epoch_int": "1727204014", "date": "2024-09-24", "time": "14:53:34", "iso8601_micro": "2024-09-24T18:53:34.289613Z", "iso8601": "2024-09-24T18:53:34Z", "iso8601_basic": "20240924T145334289613", "iso8601_basic_short": "20240924T145334", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.53515625, "5m": 0.44140625, "15m": 0.23486328125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", 
"scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727204014.35039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204014.35084: stdout chunk (state=3): >>><<< 25675 1727204014.35199: stderr chunk (state=3): >>><<< 25675 1727204014.35287: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2923, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 608, "free": 2923}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 600, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785714688, "block_size": 4096, "block_total": 65519099, "block_available": 63912528, "block_used": 1606571, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": 
"2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "34", "epoch": "1727204014", "epoch_int": "1727204014", "date": "2024-09-24", "time": "14:53:34", "iso8601_micro": "2024-09-24T18:53:34.289613Z", "iso8601": "2024-09-24T18:53:34Z", "iso8601_basic": "20240924T145334289613", "iso8601_basic_short": "20240924T145334", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.53515625, "5m": 0.44140625, "15m": 0.23486328125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, 
"filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204014.35930: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204014.35966: _low_level_execute_command(): starting 25675 1727204014.35982: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204013.5770564-28509-228336121077098/ > /dev/null 2>&1 && sleep 0' 25675 1727204014.36630: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204014.36644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204014.36660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204014.36685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204014.36703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204014.36797: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204014.36801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204014.36828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204014.36846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204014.36866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204014.37061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204014.38941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204014.39385: stdout chunk (state=3): >>><<< 25675 1727204014.39389: stderr chunk (state=3): >>><<< 25675 1727204014.39392: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204014.39395: handler run complete 25675 1727204014.39397: variable 'ansible_facts' from source: unknown 25675 1727204014.39463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.40182: variable 'ansible_facts' from source: unknown 25675 1727204014.40271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.40801: attempt loop complete, returning result 25675 1727204014.40811: _execute() done 25675 1727204014.40818: dumping result to json 25675 1727204014.40853: done dumping result, returning 25675 1727204014.40867: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-000000000417] 25675 1727204014.40881: sending task result for task 028d2410-947f-41bd-b19d-000000000417 ok: [managed-node2] 25675 1727204014.41668: no more pending results, returning what we have 25675 1727204014.41671: results queue empty 25675 1727204014.41672: checking for any_errors_fatal 25675 1727204014.41673: done checking for any_errors_fatal 25675 1727204014.41674: checking for max_fail_percentage 25675 1727204014.41979: done checking for max_fail_percentage 25675 1727204014.41981: checking to see if all hosts have failed and the running result is not ok 25675 1727204014.41982: done checking to see if all hosts have failed 25675 1727204014.41982: getting the remaining hosts for this loop 25675 1727204014.41984: done getting the remaining hosts for this loop 25675 1727204014.41988: getting the next task for host 
managed-node2 25675 1727204014.41994: done getting next task for host managed-node2 25675 1727204014.41996: ^ task is: TASK: meta (flush_handlers) 25675 1727204014.41998: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204014.42002: getting variables 25675 1727204014.42004: in VariableManager get_vars() 25675 1727204014.42038: Calling all_inventory to load vars for managed-node2 25675 1727204014.42041: Calling groups_inventory to load vars for managed-node2 25675 1727204014.42043: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204014.42054: Calling all_plugins_play to load vars for managed-node2 25675 1727204014.42058: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204014.42060: Calling groups_plugins_play to load vars for managed-node2 25675 1727204014.42839: done sending task result for task 028d2410-947f-41bd-b19d-000000000417 25675 1727204014.42842: WORKER PROCESS EXITING 25675 1727204014.45034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.47555: done with get_vars() 25675 1727204014.47588: done getting variables 25675 1727204014.47667: in VariableManager get_vars() 25675 1727204014.47685: Calling all_inventory to load vars for managed-node2 25675 1727204014.47688: Calling groups_inventory to load vars for managed-node2 25675 1727204014.47690: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204014.47696: Calling all_plugins_play to load vars for managed-node2 25675 1727204014.47698: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204014.47700: Calling groups_plugins_play to load vars for managed-node2 25675 1727204014.49130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.51737: done with get_vars() 25675 1727204014.51773: done queuing things up, now waiting for results queue to drain 25675 1727204014.51779: results queue empty 25675 1727204014.51785: checking for any_errors_fatal 25675 1727204014.51790: done checking for any_errors_fatal 25675 1727204014.51791: checking for max_fail_percentage 25675 1727204014.51792: done checking for max_fail_percentage 25675 1727204014.51793: checking to see if all hosts have failed and the running result is not ok 25675 1727204014.51794: done checking to see if all hosts have failed 25675 1727204014.51794: getting the remaining hosts for this loop 25675 1727204014.51795: done getting the remaining hosts for this loop 25675 1727204014.51798: getting the next task for host managed-node2 25675 1727204014.51803: done getting next task for host managed-node2 25675 1727204014.51806: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25675 1727204014.51807: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204014.51817: getting variables 25675 1727204014.51818: in VariableManager get_vars() 25675 1727204014.51833: Calling all_inventory to load vars for managed-node2 25675 1727204014.51836: Calling groups_inventory to load vars for managed-node2 25675 1727204014.51838: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204014.51843: Calling all_plugins_play to load vars for managed-node2 25675 1727204014.51845: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204014.51848: Calling groups_plugins_play to load vars for managed-node2 25675 1727204014.53838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.56158: done with get_vars() 25675 1727204014.56196: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:53:34 -0400 (0:00:01.037) 0:00:34.014 ***** 25675 1727204014.56294: entering _queue_task() for managed-node2/include_tasks 25675 1727204014.56702: worker is 1 (out of 1 available) 25675 1727204014.56715: exiting _queue_task() for managed-node2/include_tasks 25675 1727204014.56728: done queuing things up, now waiting for results queue to drain 25675 1727204014.56730: waiting for pending results... 25675 1727204014.57073: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25675 1727204014.57431: in run() - task 028d2410-947f-41bd-b19d-00000000005c 25675 1727204014.57505: variable 'ansible_search_path' from source: unknown 25675 1727204014.57520: variable 'ansible_search_path' from source: unknown 25675 1727204014.57652: calling self._execute() 25675 1727204014.57954: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204014.57959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204014.57961: variable 'omit' from source: magic vars 25675 1727204014.58742: variable 'ansible_distribution_major_version' from source: facts 25675 1727204014.58758: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204014.58798: _execute() done 25675 1727204014.58805: dumping result to json 25675 1727204014.58811: done dumping result, returning 25675 1727204014.58822: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-41bd-b19d-00000000005c] 25675 1727204014.59034: sending task result for task 028d2410-947f-41bd-b19d-00000000005c 25675 1727204014.59110: done sending task result for task 028d2410-947f-41bd-b19d-00000000005c 25675 1727204014.59113: WORKER PROCESS EXITING 25675 1727204014.59154: no more pending results, returning what we have 25675 1727204014.59159: in VariableManager get_vars() 25675 1727204014.59206: Calling all_inventory to load vars for managed-node2 25675 1727204014.59209: Calling groups_inventory to load vars for managed-node2 25675 1727204014.59211: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204014.59480: Calling all_plugins_play to load vars for managed-node2 25675 1727204014.59485: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204014.59489: Calling groups_plugins_play to load vars for managed-node2 25675 1727204014.62010: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.63902: done with get_vars() 25675 1727204014.63928: variable 'ansible_search_path' from source: unknown 25675 1727204014.63930: variable 'ansible_search_path' from source: unknown 25675 1727204014.63959: we have included files to process 25675 1727204014.63960: generating all_blocks data 25675 1727204014.63962: done generating all_blocks data 25675 1727204014.63962: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727204014.63964: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727204014.63966: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25675 1727204014.64568: done processing included file 25675 1727204014.64570: iterating over new_blocks loaded from include file 25675 1727204014.64572: in VariableManager get_vars() 25675 1727204014.64599: done with get_vars() 25675 1727204014.64601: filtering new block on tags 25675 1727204014.64618: done filtering new block on tags 25675 1727204014.64621: in VariableManager get_vars() 25675 1727204014.64667: done with get_vars() 25675 1727204014.64670: filtering new block on tags 25675 1727204014.64695: done filtering new block on tags 25675 1727204014.64698: in VariableManager get_vars() 25675 1727204014.64717: done with get_vars() 25675 1727204014.64719: filtering new block on tags 25675 1727204014.64735: done filtering new block on tags 25675 1727204014.64738: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 25675 1727204014.64743: extending task lists for all hosts with included blocks 25675 1727204014.65265: done extending task lists 25675 1727204014.65266: done processing included files 25675 1727204014.65267: results queue empty 25675 1727204014.65268: checking for any_errors_fatal 25675 1727204014.65269: done checking for any_errors_fatal 25675 1727204014.65270: checking for max_fail_percentage 25675 1727204014.65271: done checking for max_fail_percentage 25675 1727204014.65272: checking to see if all hosts have failed and the running result is not ok 25675 1727204014.65273: done checking to see if all hosts have failed 25675 1727204014.65273: getting the remaining hosts for this loop 25675 1727204014.65275: done getting the remaining hosts for this loop 25675 1727204014.65282: getting the next task for host managed-node2 25675 1727204014.65286: done getting next task for host managed-node2 25675 1727204014.65288: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25675 1727204014.65291: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204014.65299: getting variables 25675 1727204014.65300: in VariableManager get_vars() 25675 1727204014.65318: Calling all_inventory to load vars for managed-node2 25675 1727204014.65320: Calling groups_inventory to load vars for managed-node2 25675 1727204014.65322: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204014.65327: Calling all_plugins_play to load vars for managed-node2 25675 1727204014.65329: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204014.65332: Calling groups_plugins_play to load vars for managed-node2 25675 1727204014.66552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.68154: done with get_vars() 25675 1727204014.68184: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:53:34 -0400 (0:00:00.119) 0:00:34.133 ***** 25675 1727204014.68264: entering _queue_task() for managed-node2/setup 25675 1727204014.68641: worker is 1 (out of 1 available) 25675 1727204014.68655: exiting _queue_task() for managed-node2/setup 25675 1727204014.68671: done queuing things up, now waiting for results queue to drain 25675 1727204014.68673: waiting for pending results... 25675 1727204014.69093: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25675 1727204014.69109: in run() - task 028d2410-947f-41bd-b19d-000000000458 25675 1727204014.69134: variable 'ansible_search_path' from source: unknown 25675 1727204014.69143: variable 'ansible_search_path' from source: unknown 25675 1727204014.69190: calling self._execute() 25675 1727204014.69352: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204014.69356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204014.69360: variable 'omit' from source: magic vars 25675 1727204014.69698: variable 'ansible_distribution_major_version' from source: facts 25675 1727204014.69716: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204014.69936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204014.72243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204014.72320: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204014.72361: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204014.72481: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204014.72486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204014.72526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204014.72562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25675 1727204014.72612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204014.72648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204014.72680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204014.72732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204014.73080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204014.73083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204014.73085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204014.73087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204014.73174: variable '__network_required_facts' from source: role '' defaults 25675 1727204014.73204: variable 'ansible_facts' from source: unknown 25675 1727204014.73987: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25675 1727204014.73996: when evaluation is False, skipping this task 25675 1727204014.74003: _execute() done 25675 1727204014.74010: dumping result to json 25675 1727204014.74018: done dumping result, returning 25675 1727204014.74040: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-41bd-b19d-000000000458] 25675 1727204014.74050: sending task result for task 028d2410-947f-41bd-b19d-000000000458 25675 1727204014.74285: done sending task result for task 028d2410-947f-41bd-b19d-000000000458 25675 1727204014.74288: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204014.74340: no more pending results, returning what we have 25675 1727204014.74344: results queue empty 25675 1727204014.74345: checking for any_errors_fatal 25675 1727204014.74347: done checking for any_errors_fatal 25675 1727204014.74347: checking for max_fail_percentage 25675 1727204014.74349: done checking for max_fail_percentage 25675 1727204014.74350: checking to see if all hosts have failed and the running result is not ok 25675 1727204014.74351: done checking to see if all hosts have failed 25675 1727204014.74352: getting the remaining hosts for this loop 25675 1727204014.74353: done getting the remaining hosts for 
this loop 25675 1727204014.74358: getting the next task for host managed-node2 25675 1727204014.74368: done getting next task for host managed-node2 25675 1727204014.74372: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25675 1727204014.74379: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204014.74395: getting variables 25675 1727204014.74584: in VariableManager get_vars() 25675 1727204014.74627: Calling all_inventory to load vars for managed-node2 25675 1727204014.74630: Calling groups_inventory to load vars for managed-node2 25675 1727204014.74633: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204014.74644: Calling all_plugins_play to load vars for managed-node2 25675 1727204014.74647: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204014.74650: Calling groups_plugins_play to load vars for managed-node2 25675 1727204014.76291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.77945: done with get_vars() 25675 1727204014.77974: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:53:34 -0400 (0:00:00.098) 0:00:34.231 ***** 25675 1727204014.78083: entering _queue_task() for managed-node2/stat 25675 1727204014.78451: worker is 1 (out of 1 available) 25675 1727204014.78463: exiting _queue_task() for managed-node2/stat 25675 1727204014.78583: done queuing things up, now waiting for results queue to drain 25675 1727204014.78585: waiting for pending results... 
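The stat task being queued here (set_facts.yml:12) is a probe for an ostree-based system, guarded so it only runs when __network_is_ostree has not already been set. A minimal sketch of such a task, assuming the conventional /run/ostree-booted marker file (the actual path probed is not shown anywhere in this log), could look like:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted            # assumed marker path; not taken from this log
      register: __ostree_booted_stat        # hypothetical register name
      when: not __network_is_ostree is defined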
25675 1727204014.78825: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 25675 1727204014.78928: in run() - task 028d2410-947f-41bd-b19d-00000000045a 25675 1727204014.78948: variable 'ansible_search_path' from source: unknown 25675 1727204014.78983: variable 'ansible_search_path' from source: unknown 25675 1727204014.79001: calling self._execute() 25675 1727204014.79105: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204014.79117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204014.79283: variable 'omit' from source: magic vars 25675 1727204014.79525: variable 'ansible_distribution_major_version' from source: facts 25675 1727204014.79542: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204014.79711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204014.80011: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204014.80071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204014.80117: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204014.80163: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204014.80261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204014.80303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204014.80336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204014.80370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204014.80474: variable '__network_is_ostree' from source: set_fact 25675 1727204014.80603: Evaluated conditional (not __network_is_ostree is defined): False 25675 1727204014.80606: when evaluation is False, skipping this task 25675 1727204014.80609: _execute() done 25675 1727204014.80611: dumping result to json 25675 1727204014.80613: done dumping result, returning 25675 1727204014.80616: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-41bd-b19d-00000000045a] 25675 1727204014.80618: sending task result for task 028d2410-947f-41bd-b19d-00000000045a 25675 1727204014.80694: done sending task result for task 028d2410-947f-41bd-b19d-00000000045a 25675 1727204014.80698: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25675 1727204014.80757: no more pending results, returning what we have 25675 1727204014.80761: results queue empty 25675 1727204014.80762: checking for any_errors_fatal 25675 1727204014.80771: done checking for any_errors_fatal 25675 1727204014.80771: checking for 
max_fail_percentage 25675 1727204014.80773: done checking for max_fail_percentage 25675 1727204014.80774: checking to see if all hosts have failed and the running result is not ok 25675 1727204014.80779: done checking to see if all hosts have failed 25675 1727204014.80780: getting the remaining hosts for this loop 25675 1727204014.80782: done getting the remaining hosts for this loop 25675 1727204014.80786: getting the next task for host managed-node2 25675 1727204014.80792: done getting next task for host managed-node2 25675 1727204014.80796: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25675 1727204014.80799: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204014.80815: getting variables 25675 1727204014.80817: in VariableManager get_vars() 25675 1727204014.80861: Calling all_inventory to load vars for managed-node2 25675 1727204014.80864: Calling groups_inventory to load vars for managed-node2 25675 1727204014.80867: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204014.80984: Calling all_plugins_play to load vars for managed-node2 25675 1727204014.80993: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204014.80997: Calling groups_plugins_play to load vars for managed-node2 25675 1727204014.82615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.84353: done with get_vars() 25675 1727204014.84383: done getting variables 25675 1727204014.84443: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:53:34 -0400 (0:00:00.064) 0:00:34.296 ***** 25675 1727204014.84487: entering _queue_task() for managed-node2/set_fact 25675 1727204014.84847: worker is 1 (out of 1 available) 25675 1727204014.84860: exiting _queue_task() for managed-node2/set_fact 25675 1727204014.84874: done queuing things up, now waiting for results queue to drain 25675 1727204014.84879: waiting for pending results... 
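The set_fact task queued next (set_facts.yml:17) records the ostree flag so the probe is not repeated. The log only shows the flag name __network_is_ostree and the guard condition, so the value expression below is an assumption built on the hypothetical register from the previous sketch:

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | d(false) }}"   # assumed expression
      when: not __network_is_ostree is defined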
25675 1727204014.85292: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25675 1727204014.85298: in run() - task 028d2410-947f-41bd-b19d-00000000045b 25675 1727204014.85301: variable 'ansible_search_path' from source: unknown 25675 1727204014.85304: variable 'ansible_search_path' from source: unknown 25675 1727204014.85321: calling self._execute() 25675 1727204014.85461: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204014.85466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204014.85469: variable 'omit' from source: magic vars 25675 1727204014.85901: variable 'ansible_distribution_major_version' from source: facts 25675 1727204014.85918: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204014.86103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204014.86410: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204014.86467: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204014.86548: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204014.86552: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204014.86642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204014.86686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204014.86764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204014.86767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204014.86850: variable '__network_is_ostree' from source: set_fact 25675 1727204014.86863: Evaluated conditional (not __network_is_ostree is defined): False 25675 1727204014.86872: when evaluation is False, skipping this task 25675 1727204014.86890: _execute() done 25675 1727204014.86983: dumping result to json 25675 1727204014.86988: done dumping result, returning 25675 1727204014.86991: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-41bd-b19d-00000000045b] 25675 1727204014.86994: sending task result for task 028d2410-947f-41bd-b19d-00000000045b 25675 1727204014.87059: done sending task result for task 028d2410-947f-41bd-b19d-00000000045b 25675 1727204014.87063: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25675 1727204014.87118: no more pending results, returning what we have 25675 1727204014.87122: results queue empty 25675 1727204014.87123: checking for any_errors_fatal 25675 1727204014.87131: done checking for any_errors_fatal 25675 
1727204014.87132: checking for max_fail_percentage 25675 1727204014.87134: done checking for max_fail_percentage 25675 1727204014.87135: checking to see if all hosts have failed and the running result is not ok 25675 1727204014.87136: done checking to see if all hosts have failed 25675 1727204014.87137: getting the remaining hosts for this loop 25675 1727204014.87138: done getting the remaining hosts for this loop 25675 1727204014.87143: getting the next task for host managed-node2 25675 1727204014.87152: done getting next task for host managed-node2 25675 1727204014.87156: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25675 1727204014.87159: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204014.87179: getting variables 25675 1727204014.87181: in VariableManager get_vars() 25675 1727204014.87225: Calling all_inventory to load vars for managed-node2 25675 1727204014.87229: Calling groups_inventory to load vars for managed-node2 25675 1727204014.87231: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204014.87243: Calling all_plugins_play to load vars for managed-node2 25675 1727204014.87246: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204014.87250: Calling groups_plugins_play to load vars for managed-node2 25675 1727204014.88938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204014.90558: done with get_vars() 25675 1727204014.90598: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:53:34 -0400 (0:00:00.062) 0:00:34.358 ***** 25675 1727204014.90718: entering _queue_task() for managed-node2/service_facts 25675 1727204014.91114: worker is 1 (out of 1 available) 25675 1727204014.91128: exiting _queue_task() for managed-node2/service_facts 25675 1727204014.91141: done queuing things up, now waiting for results queue to drain 25675 1727204014.91143: waiting for pending results... 
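The task queued here (set_facts.yml:21) uses the service_facts module, which the log later confirms via the cached ansible.modules.service_facts AnsiballZ payload. The module takes no arguments and populates ansible_facts.services on the managed host, so a minimal sketch of the task is simply:

    - name: Check which services are running
      service_facts: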
25675 1727204014.91493: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 25675 1727204014.91625: in run() - task 028d2410-947f-41bd-b19d-00000000045d 25675 1727204014.91630: variable 'ansible_search_path' from source: unknown 25675 1727204014.91634: variable 'ansible_search_path' from source: unknown 25675 1727204014.91638: calling self._execute() 25675 1727204014.91732: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204014.91747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204014.91764: variable 'omit' from source: magic vars 25675 1727204014.92173: variable 'ansible_distribution_major_version' from source: facts 25675 1727204014.92192: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204014.92205: variable 'omit' from source: magic vars 25675 1727204014.92266: variable 'omit' from source: magic vars 25675 1727204014.92313: variable 'omit' from source: magic vars 25675 1727204014.92383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204014.92410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204014.92436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204014.92488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204014.92492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204014.92519: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204014.92530: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204014.92538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204014.92684: Set connection var ansible_shell_type to sh 25675 1727204014.92687: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204014.92690: Set connection var ansible_timeout to 10 25675 1727204014.92692: Set connection var ansible_pipelining to False 25675 1727204014.92694: Set connection var ansible_shell_executable to /bin/sh 25675 1727204014.92696: Set connection var ansible_connection to ssh 25675 1727204014.92722: variable 'ansible_shell_executable' from source: unknown 25675 1727204014.92732: variable 'ansible_connection' from source: unknown 25675 1727204014.92740: variable 'ansible_module_compression' from source: unknown 25675 1727204014.92747: variable 'ansible_shell_type' from source: unknown 25675 1727204014.92819: variable 'ansible_shell_executable' from source: unknown 25675 1727204014.92822: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204014.92824: variable 'ansible_pipelining' from source: unknown 25675 1727204014.92827: variable 'ansible_timeout' from source: unknown 25675 1727204014.92829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204014.92994: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204014.93011: variable 'omit' from source: magic vars 25675 
1727204014.93022: starting attempt loop 25675 1727204014.93035: running the handler 25675 1727204014.93056: _low_level_execute_command(): starting 25675 1727204014.93072: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204014.93813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204014.93868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204014.93904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204014.93947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204014.94026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204014.95743: stdout chunk (state=3): >>>/root <<< 25675 1727204014.95907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204014.95911: stdout chunk (state=3): >>><<< 25675 1727204014.95913: stderr chunk (state=3): >>><<< 25675 1727204014.96035: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204014.96040: _low_level_execute_command(): starting 25675 1727204014.96044: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139 `" && echo ansible-tmp-1727204014.9593642-28567-168740170737139="` echo 
/root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139 `" ) && sleep 0' 25675 1727204014.96617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204014.96633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204014.96698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204014.96774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204014.96795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204014.96819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204014.96933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204014.98888: stdout chunk (state=3): >>>ansible-tmp-1727204014.9593642-28567-168740170737139=/root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139 <<< 25675 1727204014.98982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204014.99026: stderr chunk (state=3): >>><<< 25675 1727204014.99029: stdout chunk (state=3): >>><<< 25675 1727204014.99044: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204014.9593642-28567-168740170737139=/root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204014.99181: variable 'ansible_module_compression' from source: unknown 25675 1727204014.99184: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 25675 1727204014.99186: variable 'ansible_facts' from source: unknown 25675 1727204014.99273: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/AnsiballZ_service_facts.py 25675 1727204014.99493: Sending initial data 25675 1727204014.99503: Sent initial data (162 bytes) 25675 1727204015.00026: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204015.00042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204015.00060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204015.00089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204015.00189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204015.00208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204015.00324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204015.01905: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204015.01987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204015.02073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpargclwft /root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/AnsiballZ_service_facts.py <<< 25675 1727204015.02080: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/AnsiballZ_service_facts.py" <<< 25675 1727204015.02147: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpargclwft" to remote "/root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/AnsiballZ_service_facts.py" <<< 25675 1727204015.03384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204015.03388: stderr chunk (state=3): >>><<< 25675 1727204015.03390: stdout chunk (state=3): >>><<< 25675 1727204015.03392: done transferring module to remote 25675 1727204015.03394: _low_level_execute_command(): starting 25675 1727204015.03397: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/ /root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/AnsiballZ_service_facts.py && sleep 0' 25675 1727204015.04047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204015.04113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204015.04131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204015.04160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204015.04264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204015.06166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204015.06193: stdout chunk (state=3): >>><<< 25675 1727204015.06209: stderr chunk (state=3): >>><<< 25675 1727204015.06229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204015.06238: _low_level_execute_command(): starting 25675 1727204015.06325: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/AnsiballZ_service_facts.py && sleep 0' 25675 1727204015.07481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204015.07485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204015.07488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204015.07490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204015.07882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204015.07890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204015.07909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204015.08006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204016.61330: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 25675 1727204016.61423: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 25675 1727204016.61438: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25675 1727204016.63128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204016.63132: stdout chunk (state=3): >>><<< 25675 1727204016.63135: stderr chunk (state=3): >>><<< 25675 1727204016.63140: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204016.64162: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204016.64257: _low_level_execute_command(): starting 25675 1727204016.64291: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204014.9593642-28567-168740170737139/ > /dev/null 2>&1 && sleep 0' 25675 1727204016.65366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204016.65384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204016.65400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204016.65419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204016.65436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204016.65450: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204016.65467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.65493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204016.65506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204016.65518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727204016.65580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.65618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204016.65638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204016.65701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204016.65905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204016.68086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204016.68090: stdout chunk (state=3): >>><<< 25675 1727204016.68093: stderr chunk (state=3): >>><<< 25675 1727204016.68095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204016.68097: handler run complete 25675 1727204016.68307: variable 'ansible_facts' from source: unknown 25675 1727204016.68466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204016.69284: variable 'ansible_facts' from source: unknown 25675 1727204016.69418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204016.69630: attempt loop complete, returning result 25675 1727204016.69640: _execute() done 25675 1727204016.69646: dumping result to json 25675 1727204016.69717: done dumping result, returning 25675 1727204016.69732: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-41bd-b19d-00000000045d] 25675 1727204016.69743: sending task result for task 028d2410-947f-41bd-b19d-00000000045d 25675 1727204016.71081: done sending task result for task 028d2410-947f-41bd-b19d-00000000045d 25675 1727204016.71085: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204016.71192: no more pending results, returning what we have 25675 1727204016.71195: results queue empty 25675 1727204016.71195: checking for any_errors_fatal 25675 1727204016.71199: done checking for any_errors_fatal 25675 1727204016.71200: checking for max_fail_percentage 25675 1727204016.71202: done checking for max_fail_percentage 25675 1727204016.71202: checking to see if all hosts have failed and the running result is not ok 25675 1727204016.71203: done checking to see if all hosts have failed 25675 1727204016.71204: getting the remaining hosts for this loop 25675 1727204016.71205: done getting the remaining hosts for this loop 25675 1727204016.71209: getting the next task for host managed-node2 25675 1727204016.71214: done getting next task for host managed-node2 25675 1727204016.71217: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25675 1727204016.71219: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204016.71228: getting variables 25675 1727204016.71229: in VariableManager get_vars() 25675 1727204016.71258: Calling all_inventory to load vars for managed-node2 25675 1727204016.71261: Calling groups_inventory to load vars for managed-node2 25675 1727204016.71263: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204016.71271: Calling all_plugins_play to load vars for managed-node2 25675 1727204016.71274: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204016.71282: Calling groups_plugins_play to load vars for managed-node2 25675 1727204016.72454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204016.74112: done with get_vars() 25675 1727204016.74140: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:53:36 -0400 (0:00:01.835) 0:00:36.193 ***** 25675 1727204016.74245: entering _queue_task() for managed-node2/package_facts 25675 1727204016.74592: worker is 1 (out of 1 available) 25675 1727204016.74605: exiting _queue_task() for managed-node2/package_facts 25675 1727204016.74617: done queuing things up, now waiting for results queue to drain 25675 1727204016.74618: waiting for pending results... 
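Editor's note on the result just logged: the long JSON blob ending in "invocation": {"module_args": {}} is the raw stdout of the service_facts module, and it registers on the controller as ansible_facts.services, a dict keyed by unit name whose values carry name, state, status and source. Because this task sets 'no_log: true', the play output only shows the "censored" marker, so the per-unit data is visible only in the raw module stdout captured above. As a minimal sketch only, assuming that stdout were saved to a file named services.json (a hypothetical name, not something this run produces), the payload could be reduced to its running units like so:

#!/usr/bin/env python3
"""Sketch: list the running units from a saved service_facts result."""
import json

# Hypothetical file holding the module stdout seen in the log above.
with open("services.json") as fh:
    result = json.load(fh)

# service_facts returns {"ansible_facts": {"services": {"<unit>": {...}}}}.
services = result["ansible_facts"]["services"]

running = sorted(
    unit
    for unit, info in services.items()
    if info.get("state") == "running"
)

for unit in running:
    # e.g. systemd-journald.service, systemd-logind.service, systemd-udevd.service
    print(unit)

The same shape applies to the package_facts result gathered next (ansible_facts.packages, keyed by package name with a list of versions per key), so an analogous filter would work on that payload as well.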
25675 1727204016.74895: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 25675 1727204016.75033: in run() - task 028d2410-947f-41bd-b19d-00000000045e 25675 1727204016.75054: variable 'ansible_search_path' from source: unknown 25675 1727204016.75062: variable 'ansible_search_path' from source: unknown 25675 1727204016.75202: calling self._execute() 25675 1727204016.75210: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204016.75222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204016.75235: variable 'omit' from source: magic vars 25675 1727204016.75615: variable 'ansible_distribution_major_version' from source: facts 25675 1727204016.75634: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204016.75646: variable 'omit' from source: magic vars 25675 1727204016.75727: variable 'omit' from source: magic vars 25675 1727204016.75769: variable 'omit' from source: magic vars 25675 1727204016.75815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204016.75860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204016.75895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204016.75920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204016.75945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204016.75990: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204016.75999: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204016.76006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204016.76293: Set connection var ansible_shell_type to sh 25675 1727204016.76296: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204016.76298: Set connection var ansible_timeout to 10 25675 1727204016.76300: Set connection var ansible_pipelining to False 25675 1727204016.76302: Set connection var ansible_shell_executable to /bin/sh 25675 1727204016.76304: Set connection var ansible_connection to ssh 25675 1727204016.76306: variable 'ansible_shell_executable' from source: unknown 25675 1727204016.76308: variable 'ansible_connection' from source: unknown 25675 1727204016.76310: variable 'ansible_module_compression' from source: unknown 25675 1727204016.76312: variable 'ansible_shell_type' from source: unknown 25675 1727204016.76314: variable 'ansible_shell_executable' from source: unknown 25675 1727204016.76316: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204016.76318: variable 'ansible_pipelining' from source: unknown 25675 1727204016.76320: variable 'ansible_timeout' from source: unknown 25675 1727204016.76322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204016.76433: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204016.76451: variable 'omit' from source: magic vars 25675 
1727204016.76460: starting attempt loop 25675 1727204016.76466: running the handler 25675 1727204016.76490: _low_level_execute_command(): starting 25675 1727204016.76502: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204016.77166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204016.77191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.77239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204016.77254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204016.77329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204016.79183: stdout chunk (state=3): >>>/root <<< 25675 1727204016.79202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204016.79312: stdout chunk (state=3): >>><<< 25675 1727204016.79315: stderr chunk (state=3): >>><<< 25675 1727204016.79319: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204016.79321: _low_level_execute_command(): starting 25675 1727204016.79323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921 `" && echo ansible-tmp-1727204016.7923717-28706-61635488627921="` echo 
/root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921 `" ) && sleep 0' 25675 1727204016.80346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.80440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204016.80483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204016.80508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204016.80702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204016.82614: stdout chunk (state=3): >>>ansible-tmp-1727204016.7923717-28706-61635488627921=/root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921 <<< 25675 1727204016.82745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204016.82752: stdout chunk (state=3): >>><<< 25675 1727204016.82760: stderr chunk (state=3): >>><<< 25675 1727204016.82781: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204016.7923717-28706-61635488627921=/root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204016.82818: variable 'ansible_module_compression' from source: unknown 25675 1727204016.82855: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 25675 1727204016.82907: variable 'ansible_facts' from 
source: unknown 25675 1727204016.83022: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/AnsiballZ_package_facts.py 25675 1727204016.83165: Sending initial data 25675 1727204016.83168: Sent initial data (161 bytes) 25675 1727204016.83812: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727204016.83816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.83895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204016.83972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204016.85521: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204016.85585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204016.85656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpykicppfy /root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/AnsiballZ_package_facts.py <<< 25675 1727204016.85660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/AnsiballZ_package_facts.py" <<< 25675 1727204016.85724: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpykicppfy" to remote "/root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/AnsiballZ_package_facts.py" <<< 25675 1727204016.86916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204016.86959: stderr chunk (state=3): >>><<< 25675 1727204016.86962: stdout chunk (state=3): >>><<< 25675 1727204016.87006: done transferring module to remote 25675 1727204016.87016: _low_level_execute_command(): starting 25675 1727204016.87023: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/ /root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/AnsiballZ_package_facts.py && sleep 0' 25675 1727204016.87465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204016.87473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.87492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.87545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204016.87551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204016.87554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204016.87623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204016.89432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204016.89460: stderr chunk (state=3): >>><<< 25675 1727204016.89464: stdout chunk (state=3): >>><<< 25675 1727204016.89480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204016.89486: _low_level_execute_command(): starting 25675 1727204016.89491: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/AnsiballZ_package_facts.py && sleep 0' 25675 1727204016.89919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204016.89923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.89934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204016.89983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204016.89998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204016.90081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204017.34528: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", 
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": 
[{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": 
"vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": 
"510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", 
"version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": 
"python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cl<<< 25675 1727204017.34682: stdout chunk (state=3): >>>oud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25675 1727204017.36396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204017.36400: stdout chunk (state=3): >>><<< 25675 1727204017.36403: stderr chunk (state=3): >>><<< 25675 1727204017.36439: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204017.53834: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204017.53865: _low_level_execute_command(): starting 25675 1727204017.54156: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204016.7923717-28706-61635488627921/ > /dev/null 2>&1 && sleep 0' 25675 1727204017.55209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204017.55213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204017.55395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204017.55599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204017.55628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204017.57544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204017.57586: stderr chunk (state=3): >>><<< 25675 1727204017.57598: stdout chunk (state=3): >>><<< 25675 1727204017.57882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204017.57886: handler run complete 25675 1727204017.59358: variable 'ansible_facts' from source: unknown 25675 1727204017.60216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204017.64201: variable 'ansible_facts' from source: unknown 25675 1727204017.65283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204017.66484: attempt loop complete, returning result 25675 1727204017.66514: _execute() done 25675 1727204017.66532: dumping result to json 25675 1727204017.66761: done dumping result, returning 25675 1727204017.66779: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-41bd-b19d-00000000045e] 25675 1727204017.66790: sending task result for task 028d2410-947f-41bd-b19d-00000000045e 25675 1727204017.77026: done sending task result for task 028d2410-947f-41bd-b19d-00000000045e 25675 1727204017.77029: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204017.77121: no more pending results, returning what we have 25675 1727204017.77124: results queue empty 25675 1727204017.77125: checking for any_errors_fatal 25675 1727204017.77129: done checking for any_errors_fatal 25675 1727204017.77130: checking for max_fail_percentage 25675 1727204017.77131: done checking for max_fail_percentage 25675 1727204017.77132: checking to see if all hosts have failed and the running result is not ok 25675 1727204017.77133: done checking to see if all hosts have failed 25675 1727204017.77133: getting the remaining hosts for this loop 25675 1727204017.77134: done getting the remaining hosts for this loop 25675 1727204017.77137: getting the next task for host managed-node2 25675 1727204017.77142: done getting next task for host managed-node2 25675 1727204017.77145: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25675 1727204017.77147: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204017.77155: getting variables 25675 1727204017.77156: in VariableManager get_vars() 25675 1727204017.77296: Calling all_inventory to load vars for managed-node2 25675 1727204017.77299: Calling groups_inventory to load vars for managed-node2 25675 1727204017.77301: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204017.77309: Calling all_plugins_play to load vars for managed-node2 25675 1727204017.77311: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204017.77314: Calling groups_plugins_play to load vars for managed-node2 25675 1727204017.79871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204017.83360: done with get_vars() 25675 1727204017.83539: done getting variables 25675 1727204017.83598: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:53:37 -0400 (0:00:01.093) 0:00:37.287 ***** 25675 1727204017.83626: entering _queue_task() for managed-node2/debug 25675 1727204017.84369: worker is 1 (out of 1 available) 25675 1727204017.84386: exiting _queue_task() for managed-node2/debug 25675 1727204017.84486: done queuing things up, now waiting for results queue to drain 25675 1727204017.84488: waiting for pending results... 25675 1727204017.85093: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 25675 1727204017.85098: in run() - task 028d2410-947f-41bd-b19d-00000000005d 25675 1727204017.85101: variable 'ansible_search_path' from source: unknown 25675 1727204017.85104: variable 'ansible_search_path' from source: unknown 25675 1727204017.85298: calling self._execute() 25675 1727204017.85395: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204017.85406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204017.85419: variable 'omit' from source: magic vars 25675 1727204017.86187: variable 'ansible_distribution_major_version' from source: facts 25675 1727204017.86202: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204017.86214: variable 'omit' from source: magic vars 25675 1727204017.86255: variable 'omit' from source: magic vars 25675 1727204017.86680: variable 'network_provider' from source: set_fact 25675 1727204017.86684: variable 'omit' from source: magic vars 25675 1727204017.86687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204017.86690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204017.86708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204017.87080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204017.87083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 
1727204017.87086: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204017.87088: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204017.87091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204017.87092: Set connection var ansible_shell_type to sh 25675 1727204017.87094: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204017.87096: Set connection var ansible_timeout to 10 25675 1727204017.87098: Set connection var ansible_pipelining to False 25675 1727204017.87100: Set connection var ansible_shell_executable to /bin/sh 25675 1727204017.87103: Set connection var ansible_connection to ssh 25675 1727204017.87304: variable 'ansible_shell_executable' from source: unknown 25675 1727204017.87313: variable 'ansible_connection' from source: unknown 25675 1727204017.87320: variable 'ansible_module_compression' from source: unknown 25675 1727204017.87330: variable 'ansible_shell_type' from source: unknown 25675 1727204017.87333: variable 'ansible_shell_executable' from source: unknown 25675 1727204017.87336: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204017.87338: variable 'ansible_pipelining' from source: unknown 25675 1727204017.87340: variable 'ansible_timeout' from source: unknown 25675 1727204017.87345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204017.87881: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204017.87884: variable 'omit' from source: magic vars 25675 1727204017.87887: starting attempt loop 25675 1727204017.87889: running the handler 25675 1727204017.87892: handler run complete 25675 1727204017.87894: attempt loop complete, returning result 25675 1727204017.87896: _execute() done 25675 1727204017.87898: dumping result to json 25675 1727204017.87900: done dumping result, returning 25675 1727204017.87903: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-41bd-b19d-00000000005d] 25675 1727204017.87905: sending task result for task 028d2410-947f-41bd-b19d-00000000005d 25675 1727204017.87979: done sending task result for task 028d2410-947f-41bd-b19d-00000000005d ok: [managed-node2] => {} MSG: Using network provider: nm 25675 1727204017.88042: WORKER PROCESS EXITING 25675 1727204017.88114: no more pending results, returning what we have 25675 1727204017.88118: results queue empty 25675 1727204017.88119: checking for any_errors_fatal 25675 1727204017.88130: done checking for any_errors_fatal 25675 1727204017.88131: checking for max_fail_percentage 25675 1727204017.88133: done checking for max_fail_percentage 25675 1727204017.88134: checking to see if all hosts have failed and the running result is not ok 25675 1727204017.88135: done checking to see if all hosts have failed 25675 1727204017.88135: getting the remaining hosts for this loop 25675 1727204017.88137: done getting the remaining hosts for this loop 25675 1727204017.88141: getting the next task for host managed-node2 25675 1727204017.88146: done getting next task for host managed-node2 25675 1727204017.88151: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 25675 1727204017.88152: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204017.88162: getting variables 25675 1727204017.88163: in VariableManager get_vars() 25675 1727204017.88203: Calling all_inventory to load vars for managed-node2 25675 1727204017.88206: Calling groups_inventory to load vars for managed-node2 25675 1727204017.88208: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204017.88330: Calling all_plugins_play to load vars for managed-node2 25675 1727204017.88334: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204017.88337: Calling groups_plugins_play to load vars for managed-node2 25675 1727204017.91230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204017.94941: done with get_vars() 25675 1727204017.94968: done getting variables 25675 1727204017.95333: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:53:37 -0400 (0:00:00.117) 0:00:37.404 ***** 25675 1727204017.95366: entering _queue_task() for managed-node2/fail 25675 1727204017.96118: worker is 1 (out of 1 available) 25675 1727204017.96130: exiting _queue_task() for managed-node2/fail 25675 1727204017.96142: done queuing things up, now waiting for results queue to drain 25675 1727204017.96144: waiting for pending results... 
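The debug task reported just above printed "Using network provider: nm" by templating the network_provider fact set earlier in the run. A minimal sketch of what the task at roles/network/tasks/main.yml:7 presumably looks like, reconstructed from the log rather than copied from the role source:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
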
25675 1727204017.96413: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25675 1727204017.96772: in run() - task 028d2410-947f-41bd-b19d-00000000005e 25675 1727204017.96801: variable 'ansible_search_path' from source: unknown 25675 1727204017.96813: variable 'ansible_search_path' from source: unknown 25675 1727204017.96861: calling self._execute() 25675 1727204017.97096: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204017.97113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204017.97214: variable 'omit' from source: magic vars 25675 1727204017.98001: variable 'ansible_distribution_major_version' from source: facts 25675 1727204017.98020: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204017.98325: variable 'network_state' from source: role '' defaults 25675 1727204017.98344: Evaluated conditional (network_state != {}): False 25675 1727204017.98354: when evaluation is False, skipping this task 25675 1727204017.98362: _execute() done 25675 1727204017.98370: dumping result to json 25675 1727204017.98382: done dumping result, returning 25675 1727204017.98395: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-41bd-b19d-00000000005e] 25675 1727204017.98434: sending task result for task 028d2410-947f-41bd-b19d-00000000005e skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204017.98660: no more pending results, returning what we have 25675 1727204017.98664: results queue empty 25675 1727204017.98665: checking for any_errors_fatal 25675 1727204017.98674: done checking for any_errors_fatal 25675 1727204017.98677: checking for max_fail_percentage 25675 1727204017.98679: done checking for max_fail_percentage 25675 1727204017.98680: checking to see if all hosts have failed and the running result is not ok 25675 1727204017.98681: done checking to see if all hosts have failed 25675 1727204017.98682: getting the remaining hosts for this loop 25675 1727204017.98683: done getting the remaining hosts for this loop 25675 1727204017.98687: getting the next task for host managed-node2 25675 1727204017.98693: done getting next task for host managed-node2 25675 1727204017.98697: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25675 1727204017.98700: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204017.98714: getting variables 25675 1727204017.98716: in VariableManager get_vars() 25675 1727204017.98753: Calling all_inventory to load vars for managed-node2 25675 1727204017.98756: Calling groups_inventory to load vars for managed-node2 25675 1727204017.98759: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204017.98772: Calling all_plugins_play to load vars for managed-node2 25675 1727204017.98979: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204017.98985: Calling groups_plugins_play to load vars for managed-node2 25675 1727204017.99784: done sending task result for task 028d2410-947f-41bd-b19d-00000000005e 25675 1727204017.99788: WORKER PROCESS EXITING 25675 1727204018.01522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.04706: done with get_vars() 25675 1727204018.04736: done getting variables 25675 1727204018.05008: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.096) 0:00:37.501 ***** 25675 1727204018.05039: entering _queue_task() for managed-node2/fail 25675 1727204018.05599: worker is 1 (out of 1 available) 25675 1727204018.05612: exiting _queue_task() for managed-node2/fail 25675 1727204018.05625: done queuing things up, now waiting for results queue to drain 25675 1727204018.05626: waiting for pending results... 
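The package inventory dumped at the top of this section came from the role's "Check which packages are installed" task; its result was censored because no_log was set, and the recorded module_args were manager: ["auto"], strategy: "first" (the module defaults). A sketch of an equivalent task, assuming nothing beyond what the log shows:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
      no_log: true
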
25675 1727204018.06101: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25675 1727204018.06386: in run() - task 028d2410-947f-41bd-b19d-00000000005f 25675 1727204018.06410: variable 'ansible_search_path' from source: unknown 25675 1727204018.06447: variable 'ansible_search_path' from source: unknown 25675 1727204018.06683: calling self._execute() 25675 1727204018.06789: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.06801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.06814: variable 'omit' from source: magic vars 25675 1727204018.07489: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.07506: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.07637: variable 'network_state' from source: role '' defaults 25675 1727204018.07654: Evaluated conditional (network_state != {}): False 25675 1727204018.07662: when evaluation is False, skipping this task 25675 1727204018.07670: _execute() done 25675 1727204018.07682: dumping result to json 25675 1727204018.07692: done dumping result, returning 25675 1727204018.07703: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-41bd-b19d-00000000005f] 25675 1727204018.07713: sending task result for task 028d2410-947f-41bd-b19d-00000000005f 25675 1727204018.08086: done sending task result for task 028d2410-947f-41bd-b19d-00000000005f 25675 1727204018.08090: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204018.08133: no more pending results, returning what we have 25675 1727204018.08136: results queue empty 25675 1727204018.08137: checking for any_errors_fatal 25675 1727204018.08143: done checking for any_errors_fatal 25675 1727204018.08144: checking for max_fail_percentage 25675 1727204018.08146: done checking for max_fail_percentage 25675 1727204018.08147: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.08148: done checking to see if all hosts have failed 25675 1727204018.08149: getting the remaining hosts for this loop 25675 1727204018.08150: done getting the remaining hosts for this loop 25675 1727204018.08153: getting the next task for host managed-node2 25675 1727204018.08158: done getting next task for host managed-node2 25675 1727204018.08162: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25675 1727204018.08164: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204018.08184: getting variables 25675 1727204018.08186: in VariableManager get_vars() 25675 1727204018.08225: Calling all_inventory to load vars for managed-node2 25675 1727204018.08228: Calling groups_inventory to load vars for managed-node2 25675 1727204018.08231: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.08241: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.08244: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.08247: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.09971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.12275: done with get_vars() 25675 1727204018.12394: done getting variables 25675 1727204018.12586: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.075) 0:00:37.577 ***** 25675 1727204018.12618: entering _queue_task() for managed-node2/fail 25675 1727204018.13016: worker is 1 (out of 1 available) 25675 1727204018.13029: exiting _queue_task() for managed-node2/fail 25675 1727204018.13041: done queuing things up, now waiting for results queue to drain 25675 1727204018.13043: waiting for pending results... 
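Both abort tasks above (tasks/main.yml:11 and :18) were skipped with false_condition "network_state != {}", because the role default for network_state is an empty dict. A sketch of that guard pattern; the fail message is hypothetical, and the real tasks may combine further conditions that the log does not record once the first one evaluates false:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported in this situation  # hypothetical message, not taken from the log
      when: network_state != {}
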
25675 1727204018.13348: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25675 1727204018.13482: in run() - task 028d2410-947f-41bd-b19d-000000000060 25675 1727204018.13506: variable 'ansible_search_path' from source: unknown 25675 1727204018.13515: variable 'ansible_search_path' from source: unknown 25675 1727204018.13567: calling self._execute() 25675 1727204018.13680: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.13695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.13767: variable 'omit' from source: magic vars 25675 1727204018.14131: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.14149: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.14329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204018.17048: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204018.17136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204018.17192: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204018.17240: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204018.17311: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204018.17384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.17438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.17583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.17588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.17591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.17659: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.17689: Evaluated conditional (ansible_distribution_major_version | int > 9): True 25675 1727204018.17823: variable 'ansible_distribution' from source: facts 25675 1727204018.17832: variable '__network_rh_distros' from source: role '' defaults 25675 1727204018.17845: Evaluated conditional (ansible_distribution in __network_rh_distros): True 25675 1727204018.18113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.18182: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.18186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.18223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.18252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.18310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.18338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.18382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.18469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.18472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.18494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.18524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.18555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.18612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.18630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.18960: variable 'network_connections' from source: play vars 25675 1727204018.19013: variable 'profile' from source: play vars 25675 1727204018.19064: variable 'profile' from source: play vars 25675 1727204018.19079: variable 'interface' from source: set_fact 25675 1727204018.19150: variable 'interface' from source: set_fact 25675 1727204018.19227: variable 'network_state' from source: role '' defaults 25675 
1727204018.19247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204018.19436: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204018.19489: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204018.19522: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204018.19562: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204018.19615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204018.19769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204018.19772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.19775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204018.19782: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 25675 1727204018.19784: when evaluation is False, skipping this task 25675 1727204018.19787: _execute() done 25675 1727204018.19789: dumping result to json 25675 1727204018.19791: done dumping result, returning 25675 1727204018.19793: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-41bd-b19d-000000000060] 25675 1727204018.19795: sending task result for task 028d2410-947f-41bd-b19d-000000000060 skipping: [managed-node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 25675 1727204018.20128: no more pending results, returning what we have 25675 1727204018.20131: results queue empty 25675 1727204018.20132: checking for any_errors_fatal 25675 1727204018.20139: done checking for any_errors_fatal 25675 1727204018.20140: checking for max_fail_percentage 25675 1727204018.20142: done checking for max_fail_percentage 25675 1727204018.20143: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.20144: done checking to see if all hosts have failed 25675 1727204018.20144: getting the remaining hosts for this loop 25675 1727204018.20146: done getting the remaining hosts for this loop 25675 1727204018.20149: getting the next task for host managed-node2 25675 1727204018.20155: done getting next task for host managed-node2 25675 1727204018.20159: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25675 1727204018.20161: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204018.20175: getting variables 25675 1727204018.20180: in VariableManager get_vars() 25675 1727204018.20229: Calling all_inventory to load vars for managed-node2 25675 1727204018.20232: Calling groups_inventory to load vars for managed-node2 25675 1727204018.20234: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.20245: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.20248: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.20251: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.20811: done sending task result for task 028d2410-947f-41bd-b19d-000000000060 25675 1727204018.20815: WORKER PROCESS EXITING 25675 1727204018.21831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.23510: done with get_vars() 25675 1727204018.23537: done getting variables 25675 1727204018.23599: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.110) 0:00:37.687 ***** 25675 1727204018.23628: entering _queue_task() for managed-node2/dnf 25675 1727204018.23958: worker is 1 (out of 1 available) 25675 1727204018.24186: exiting _queue_task() for managed-node2/dnf 25675 1727204018.24197: done queuing things up, now waiting for results queue to drain 25675 1727204018.24198: waiting for pending results... 
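The teaming abort (tasks/main.yml:25) passed its distribution checks (major version > 9, distribution in __network_rh_distros) but skipped on the team-interface test quoted in the skip result above. A sketch of that task reusing the exact Jinja2 expression from the recorded false_condition; only the fail message is invented:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later  # hypothetical wording
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
            or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
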
25675 1727204018.24268: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25675 1727204018.24390: in run() - task 028d2410-947f-41bd-b19d-000000000061 25675 1727204018.24411: variable 'ansible_search_path' from source: unknown 25675 1727204018.24425: variable 'ansible_search_path' from source: unknown 25675 1727204018.24467: calling self._execute() 25675 1727204018.24568: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.24585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.24599: variable 'omit' from source: magic vars 25675 1727204018.24985: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.25001: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.25210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204018.27851: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204018.27928: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204018.27966: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204018.28014: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204018.28043: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204018.28130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.28162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.28197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.28248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.28267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.28395: variable 'ansible_distribution' from source: facts 25675 1727204018.28404: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.28423: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25675 1727204018.28551: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204018.28695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.28761: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.28764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.28796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.28813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.28853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.28890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.28916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.28956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.29086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.29089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.29092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.29094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.29122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.29139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.29304: variable 'network_connections' from source: play vars 25675 1727204018.29320: variable 'profile' from source: play vars 25675 1727204018.29391: variable 'profile' from source: play vars 25675 1727204018.29401: variable 'interface' from source: set_fact 25675 1727204018.29466: variable 'interface' from source: set_fact 25675 1727204018.29567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 25675 1727204018.29723: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204018.29769: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204018.29809: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204018.29849: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204018.29901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204018.29927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204018.29974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.30063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204018.30066: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727204018.30295: variable 'network_connections' from source: play vars 25675 1727204018.30305: variable 'profile' from source: play vars 25675 1727204018.30361: variable 'profile' from source: play vars 25675 1727204018.30370: variable 'interface' from source: set_fact 25675 1727204018.30438: variable 'interface' from source: set_fact 25675 1727204018.30467: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727204018.30474: when evaluation is False, skipping this task 25675 1727204018.30486: _execute() done 25675 1727204018.30492: dumping result to json 25675 1727204018.30585: done dumping result, returning 25675 1727204018.30587: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000061] 25675 1727204018.30589: sending task result for task 028d2410-947f-41bd-b19d-000000000061 25675 1727204018.30664: done sending task result for task 028d2410-947f-41bd-b19d-000000000061 25675 1727204018.30667: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727204018.30731: no more pending results, returning what we have 25675 1727204018.30735: results queue empty 25675 1727204018.30736: checking for any_errors_fatal 25675 1727204018.30744: done checking for any_errors_fatal 25675 1727204018.30745: checking for max_fail_percentage 25675 1727204018.30747: done checking for max_fail_percentage 25675 1727204018.30748: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.30749: done checking to see if all hosts have failed 25675 1727204018.30750: getting the remaining hosts for this loop 25675 1727204018.30751: done getting the remaining hosts for this loop 25675 
1727204018.30756: getting the next task for host managed-node2 25675 1727204018.30762: done getting next task for host managed-node2 25675 1727204018.30766: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25675 1727204018.30768: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204018.30788: getting variables 25675 1727204018.30790: in VariableManager get_vars() 25675 1727204018.30833: Calling all_inventory to load vars for managed-node2 25675 1727204018.30836: Calling groups_inventory to load vars for managed-node2 25675 1727204018.30838: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.30850: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.30853: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.30856: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.34119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.37647: done with get_vars() 25675 1727204018.37687: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25675 1727204018.37759: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.142) 0:00:37.830 ***** 25675 1727204018.37903: entering _queue_task() for managed-node2/yum 25675 1727204018.38521: worker is 1 (out of 1 available) 25675 1727204018.38534: exiting _queue_task() for managed-node2/yum 25675 1727204018.38546: done queuing things up, now waiting for results queue to drain 25675 1727204018.38663: waiting for pending results... 
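The DNF-based update check (tasks/main.yml:36) passed its distribution condition but was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this profile. A structural sketch under stated assumptions: the package list variable, state, and check_mode are not visible in this log and are marked as assumptions in the comments; the two when conditions are the ones evaluated above:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # variable name assumed; the actual package list is not recorded in this log
        state: latest                   # assumed from the task name ("check if updates ... are available")
      check_mode: true                  # assumed; a pure availability check should not change the host
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined
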
25675 1727204018.39003: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25675 1727204018.39009: in run() - task 028d2410-947f-41bd-b19d-000000000062 25675 1727204018.39024: variable 'ansible_search_path' from source: unknown 25675 1727204018.39033: variable 'ansible_search_path' from source: unknown 25675 1727204018.39082: calling self._execute() 25675 1727204018.39213: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.39322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.39327: variable 'omit' from source: magic vars 25675 1727204018.39657: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.39674: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.39839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204018.41678: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204018.41724: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204018.41751: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204018.41779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204018.41805: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204018.41864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.41900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.41923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.41948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.41960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.42032: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.42044: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25675 1727204018.42047: when evaluation is False, skipping this task 25675 1727204018.42050: _execute() done 25675 1727204018.42052: dumping result to json 25675 1727204018.42055: done dumping result, returning 25675 1727204018.42062: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000062] 25675 
1727204018.42067: sending task result for task 028d2410-947f-41bd-b19d-000000000062 25675 1727204018.42156: done sending task result for task 028d2410-947f-41bd-b19d-000000000062 25675 1727204018.42158: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25675 1727204018.42210: no more pending results, returning what we have 25675 1727204018.42213: results queue empty 25675 1727204018.42214: checking for any_errors_fatal 25675 1727204018.42219: done checking for any_errors_fatal 25675 1727204018.42220: checking for max_fail_percentage 25675 1727204018.42222: done checking for max_fail_percentage 25675 1727204018.42223: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.42223: done checking to see if all hosts have failed 25675 1727204018.42224: getting the remaining hosts for this loop 25675 1727204018.42225: done getting the remaining hosts for this loop 25675 1727204018.42229: getting the next task for host managed-node2 25675 1727204018.42234: done getting next task for host managed-node2 25675 1727204018.42238: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25675 1727204018.42240: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204018.42252: getting variables 25675 1727204018.42254: in VariableManager get_vars() 25675 1727204018.42299: Calling all_inventory to load vars for managed-node2 25675 1727204018.42302: Calling groups_inventory to load vars for managed-node2 25675 1727204018.42304: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.42313: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.42316: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.42318: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.43126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.44673: done with get_vars() 25675 1727204018.44702: done getting variables 25675 1727204018.44762: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.068) 0:00:37.899 ***** 25675 1727204018.44794: entering _queue_task() for managed-node2/fail 25675 1727204018.45105: worker is 1 (out of 1 available) 25675 1727204018.45118: exiting _queue_task() for managed-node2/fail 25675 1727204018.45131: done queuing things up, now waiting for results queue to drain 25675 1727204018.45132: waiting for pending results... 
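The YUM variant of the same check (tasks/main.yml:48) was skipped because ansible_distribution_major_version | int < 8 is false on this EL10 host; note that the control node redirected ansible.builtin.yum to ansible.builtin.dnf, per the "redirecting (type: action)" line above. A sketch of the version split, with the same assumptions as the DNF sketch:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:              # redirected to ansible.builtin.dnf on this control node
        name: "{{ network_packages }}"  # assumed, as in the DNF sketch
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined  # presumed to mirror the DNF task; only the version check is recorded here
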
25675 1727204018.45503: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25675 1727204018.45520: in run() - task 028d2410-947f-41bd-b19d-000000000063 25675 1727204018.45541: variable 'ansible_search_path' from source: unknown 25675 1727204018.45549: variable 'ansible_search_path' from source: unknown 25675 1727204018.45596: calling self._execute() 25675 1727204018.45782: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.45786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.45789: variable 'omit' from source: magic vars 25675 1727204018.46112: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.46133: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.46254: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204018.46453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204018.48999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204018.49068: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204018.49112: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204018.49150: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204018.49188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204018.49267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.49309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.49342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.49393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.49491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.49494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.49496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.49523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.49577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.49622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.49689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.49723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.49752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.49797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.49850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.50008: variable 'network_connections' from source: play vars 25675 1727204018.50026: variable 'profile' from source: play vars 25675 1727204018.50106: variable 'profile' from source: play vars 25675 1727204018.50144: variable 'interface' from source: set_fact 25675 1727204018.50187: variable 'interface' from source: set_fact 25675 1727204018.50285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204018.50449: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204018.50496: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204018.50579: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204018.50582: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204018.50615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204018.50641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204018.50671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.50703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204018.50752: 
variable '__network_team_connections_defined' from source: role '' defaults 25675 1727204018.50979: variable 'network_connections' from source: play vars 25675 1727204018.51005: variable 'profile' from source: play vars 25675 1727204018.51052: variable 'profile' from source: play vars 25675 1727204018.51060: variable 'interface' from source: set_fact 25675 1727204018.51222: variable 'interface' from source: set_fact 25675 1727204018.51225: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727204018.51227: when evaluation is False, skipping this task 25675 1727204018.51229: _execute() done 25675 1727204018.51231: dumping result to json 25675 1727204018.51233: done dumping result, returning 25675 1727204018.51235: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000063] 25675 1727204018.51245: sending task result for task 028d2410-947f-41bd-b19d-000000000063 25675 1727204018.51313: done sending task result for task 028d2410-947f-41bd-b19d-000000000063 25675 1727204018.51316: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727204018.51366: no more pending results, returning what we have 25675 1727204018.51369: results queue empty 25675 1727204018.51370: checking for any_errors_fatal 25675 1727204018.51377: done checking for any_errors_fatal 25675 1727204018.51378: checking for max_fail_percentage 25675 1727204018.51380: done checking for max_fail_percentage 25675 1727204018.51381: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.51382: done checking to see if all hosts have failed 25675 1727204018.51383: getting the remaining hosts for this loop 25675 1727204018.51384: done getting the remaining hosts for this loop 25675 1727204018.51388: getting the next task for host managed-node2 25675 1727204018.51394: done getting next task for host managed-node2 25675 1727204018.51398: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 25675 1727204018.51400: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204018.51414: getting variables 25675 1727204018.51416: in VariableManager get_vars() 25675 1727204018.51455: Calling all_inventory to load vars for managed-node2 25675 1727204018.51458: Calling groups_inventory to load vars for managed-node2 25675 1727204018.51460: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.51709: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.51713: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.51717: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.53223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.54761: done with get_vars() 25675 1727204018.54787: done getting variables 25675 1727204018.54845: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.100) 0:00:37.999 ***** 25675 1727204018.54879: entering _queue_task() for managed-node2/package 25675 1727204018.55198: worker is 1 (out of 1 available) 25675 1727204018.55210: exiting _queue_task() for managed-node2/package 25675 1727204018.55223: done queuing things up, now waiting for results queue to drain 25675 1727204018.55224: waiting for pending results... 25675 1727204018.55601: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 25675 1727204018.55606: in run() - task 028d2410-947f-41bd-b19d-000000000064 25675 1727204018.55627: variable 'ansible_search_path' from source: unknown 25675 1727204018.55635: variable 'ansible_search_path' from source: unknown 25675 1727204018.55674: calling self._execute() 25675 1727204018.55778: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.55796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.55881: variable 'omit' from source: magic vars 25675 1727204018.56177: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.56194: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.56396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204018.56661: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204018.56715: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204018.56753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204018.56831: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204018.56948: variable 'network_packages' from source: role '' defaults 25675 1727204018.57066: variable '__network_provider_setup' from source: role '' defaults 25675 1727204018.57085: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727204018.57281: variable 
'__network_service_name_default_nm' from source: role '' defaults 25675 1727204018.57284: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727204018.57287: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727204018.57424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204018.59395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204018.59465: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204018.59511: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204018.59546: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204018.59681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204018.59687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.59702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.59729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.59766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.59785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.59831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.59856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.59887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.59932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.59949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.60482: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25675 1727204018.60485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.60487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.60489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.60613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.60702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.60800: variable 'ansible_python' from source: facts 25675 1727204018.60945: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25675 1727204018.61138: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727204018.61332: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727204018.61599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.61706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.61733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.61773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.61981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.61984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.61994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.62223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.62226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.62229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.62430: variable 'network_connections' from source: play vars 25675 1727204018.62553: variable 'profile' from source: play vars 25675 1727204018.62658: variable 'profile' from source: play vars 25675 1727204018.62779: variable 'interface' from source: set_fact 25675 1727204018.62850: variable 'interface' from source: set_fact 25675 1727204018.63009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204018.63035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204018.63065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.63107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204018.63150: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204018.63464: variable 'network_connections' from source: play vars 25675 1727204018.63468: variable 'profile' from source: play vars 25675 1727204018.63577: variable 'profile' from source: play vars 25675 1727204018.63588: variable 'interface' from source: set_fact 25675 1727204018.63660: variable 'interface' from source: set_fact 25675 1727204018.63696: variable '__network_packages_default_wireless' from source: role '' defaults 25675 1727204018.63778: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204018.64111: variable 'network_connections' from source: play vars 25675 1727204018.64114: variable 'profile' from source: play vars 25675 1727204018.64180: variable 'profile' from source: play vars 25675 1727204018.64207: variable 'interface' from source: set_fact 25675 1727204018.64278: variable 'interface' from source: set_fact 25675 1727204018.64355: variable '__network_packages_default_team' from source: role '' defaults 25675 1727204018.64585: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727204018.64694: variable 'network_connections' from source: play vars 25675 1727204018.64697: variable 'profile' from source: play vars 25675 1727204018.64766: variable 'profile' from source: play vars 25675 1727204018.64769: variable 'interface' from source: set_fact 25675 1727204018.64888: variable 'interface' from source: set_fact 25675 1727204018.64939: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727204018.65192: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727204018.65198: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727204018.65254: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727204018.65477: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25675 1727204018.65969: variable 'network_connections' from source: play vars 25675 1727204018.65973: variable 'profile' from source: play vars 25675 
1727204018.66035: variable 'profile' from source: play vars 25675 1727204018.66043: variable 'interface' from source: set_fact 25675 1727204018.66109: variable 'interface' from source: set_fact 25675 1727204018.66117: variable 'ansible_distribution' from source: facts 25675 1727204018.66120: variable '__network_rh_distros' from source: role '' defaults 25675 1727204018.66126: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.66140: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25675 1727204018.66338: variable 'ansible_distribution' from source: facts 25675 1727204018.66341: variable '__network_rh_distros' from source: role '' defaults 25675 1727204018.66346: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.66360: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25675 1727204018.66543: variable 'ansible_distribution' from source: facts 25675 1727204018.66547: variable '__network_rh_distros' from source: role '' defaults 25675 1727204018.66553: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.66603: variable 'network_provider' from source: set_fact 25675 1727204018.66618: variable 'ansible_facts' from source: unknown 25675 1727204018.67556: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 25675 1727204018.67562: when evaluation is False, skipping this task 25675 1727204018.67567: _execute() done 25675 1727204018.67570: dumping result to json 25675 1727204018.67574: done dumping result, returning 25675 1727204018.67588: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-41bd-b19d-000000000064] 25675 1727204018.67591: sending task result for task 028d2410-947f-41bd-b19d-000000000064 25675 1727204018.67780: done sending task result for task 028d2410-947f-41bd-b19d-000000000064 25675 1727204018.67783: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 25675 1727204018.67828: no more pending results, returning what we have 25675 1727204018.67831: results queue empty 25675 1727204018.67832: checking for any_errors_fatal 25675 1727204018.67839: done checking for any_errors_fatal 25675 1727204018.67840: checking for max_fail_percentage 25675 1727204018.67842: done checking for max_fail_percentage 25675 1727204018.67843: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.67843: done checking to see if all hosts have failed 25675 1727204018.67844: getting the remaining hosts for this loop 25675 1727204018.67845: done getting the remaining hosts for this loop 25675 1727204018.67848: getting the next task for host managed-node2 25675 1727204018.67853: done getting next task for host managed-node2 25675 1727204018.67856: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25675 1727204018.67858: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204018.67871: getting variables 25675 1727204018.67872: in VariableManager get_vars() 25675 1727204018.67911: Calling all_inventory to load vars for managed-node2 25675 1727204018.67914: Calling groups_inventory to load vars for managed-node2 25675 1727204018.67916: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.67929: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.67932: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.67934: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.69391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.71762: done with get_vars() 25675 1727204018.71795: done getting variables 25675 1727204018.71860: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.170) 0:00:38.170 ***** 25675 1727204018.71898: entering _queue_task() for managed-node2/package 25675 1727204018.72385: worker is 1 (out of 1 available) 25675 1727204018.72395: exiting _queue_task() for managed-node2/package 25675 1727204018.72407: done queuing things up, now waiting for results queue to drain 25675 1727204018.72408: waiting for pending results... 
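Both guarded tasks above resolve to skips: the consent prompt (a fail action) because neither '__network_wireless_connections_defined' nor '__network_team_connections_defined' holds for this profile, and "Install packages" because every entry in 'network_packages' is already a key in 'ansible_facts.packages'. A minimal sketch of a package task guarded that way, assuming a plausible shape: the state value is an assumption, while the module family (the 'package' action plugin) and the when: expression come straight from the log.

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present                                                      # assumed
  when: not network_packages is subset(ansible_facts.packages.keys())   # false_condition from the log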
25675 1727204018.72647: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25675 1727204018.72711: in run() - task 028d2410-947f-41bd-b19d-000000000065 25675 1727204018.72750: variable 'ansible_search_path' from source: unknown 25675 1727204018.72758: variable 'ansible_search_path' from source: unknown 25675 1727204018.72805: calling self._execute() 25675 1727204018.72942: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.72962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.72984: variable 'omit' from source: magic vars 25675 1727204018.73501: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.73506: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.73557: variable 'network_state' from source: role '' defaults 25675 1727204018.73572: Evaluated conditional (network_state != {}): False 25675 1727204018.73585: when evaluation is False, skipping this task 25675 1727204018.73593: _execute() done 25675 1727204018.73600: dumping result to json 25675 1727204018.73613: done dumping result, returning 25675 1727204018.73630: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-41bd-b19d-000000000065] 25675 1727204018.73717: sending task result for task 028d2410-947f-41bd-b19d-000000000065 25675 1727204018.73795: done sending task result for task 028d2410-947f-41bd-b19d-000000000065 25675 1727204018.73799: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204018.73850: no more pending results, returning what we have 25675 1727204018.73854: results queue empty 25675 1727204018.73855: checking for any_errors_fatal 25675 1727204018.73862: done checking for any_errors_fatal 25675 1727204018.73862: checking for max_fail_percentage 25675 1727204018.73864: done checking for max_fail_percentage 25675 1727204018.73865: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.73866: done checking to see if all hosts have failed 25675 1727204018.73867: getting the remaining hosts for this loop 25675 1727204018.73868: done getting the remaining hosts for this loop 25675 1727204018.73873: getting the next task for host managed-node2 25675 1727204018.73884: done getting next task for host managed-node2 25675 1727204018.73888: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25675 1727204018.73891: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204018.73907: getting variables 25675 1727204018.73909: in VariableManager get_vars() 25675 1727204018.73952: Calling all_inventory to load vars for managed-node2 25675 1727204018.73955: Calling groups_inventory to load vars for managed-node2 25675 1727204018.73957: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.73970: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.73973: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.74193: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.76252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.78208: done with get_vars() 25675 1727204018.78313: done getting variables 25675 1727204018.78397: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.065) 0:00:38.235 ***** 25675 1727204018.78450: entering _queue_task() for managed-node2/package 25675 1727204018.78821: worker is 1 (out of 1 available) 25675 1727204018.78837: exiting _queue_task() for managed-node2/package 25675 1727204018.78965: done queuing things up, now waiting for results queue to drain 25675 1727204018.78966: waiting for pending results... 
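"Install NetworkManager and nmstate when using network_state variable" skips because 'network_state != {}' is False; the trace resolves 'network_state' from the role defaults, so on this run it is evidently the empty mapping. A minimal sketch of a task with that shape, with the package names inferred from the task title and the state assumed.

# role defaults presumably carry something like: network_state: {}
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present              # assumed
  when: network_state != {}     # false_condition from the log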
25675 1727204018.79299: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25675 1727204018.79305: in run() - task 028d2410-947f-41bd-b19d-000000000066 25675 1727204018.79308: variable 'ansible_search_path' from source: unknown 25675 1727204018.79311: variable 'ansible_search_path' from source: unknown 25675 1727204018.79352: calling self._execute() 25675 1727204018.79460: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.79478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.79504: variable 'omit' from source: magic vars 25675 1727204018.79944: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.79948: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.80043: variable 'network_state' from source: role '' defaults 25675 1727204018.80070: Evaluated conditional (network_state != {}): False 25675 1727204018.80081: when evaluation is False, skipping this task 25675 1727204018.80089: _execute() done 25675 1727204018.80096: dumping result to json 25675 1727204018.80106: done dumping result, returning 25675 1727204018.80117: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-41bd-b19d-000000000066] 25675 1727204018.80162: sending task result for task 028d2410-947f-41bd-b19d-000000000066 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204018.80315: no more pending results, returning what we have 25675 1727204018.80320: results queue empty 25675 1727204018.80321: checking for any_errors_fatal 25675 1727204018.80331: done checking for any_errors_fatal 25675 1727204018.80332: checking for max_fail_percentage 25675 1727204018.80334: done checking for max_fail_percentage 25675 1727204018.80335: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.80336: done checking to see if all hosts have failed 25675 1727204018.80336: getting the remaining hosts for this loop 25675 1727204018.80338: done getting the remaining hosts for this loop 25675 1727204018.80342: getting the next task for host managed-node2 25675 1727204018.80353: done getting next task for host managed-node2 25675 1727204018.80357: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25675 1727204018.80359: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204018.80379: getting variables 25675 1727204018.80384: in VariableManager get_vars() 25675 1727204018.80423: Calling all_inventory to load vars for managed-node2 25675 1727204018.80426: Calling groups_inventory to load vars for managed-node2 25675 1727204018.80428: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.80441: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.80444: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.80447: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.81101: done sending task result for task 028d2410-947f-41bd-b19d-000000000066 25675 1727204018.81105: WORKER PROCESS EXITING 25675 1727204018.82487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.84084: done with get_vars() 25675 1727204018.84109: done getting variables 25675 1727204018.84171: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.057) 0:00:38.293 ***** 25675 1727204018.84206: entering _queue_task() for managed-node2/service 25675 1727204018.84533: worker is 1 (out of 1 available) 25675 1727204018.84545: exiting _queue_task() for managed-node2/service 25675 1727204018.84559: done queuing things up, now waiting for results queue to drain 25675 1727204018.84560: waiting for pending results... 
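The task just queued, "Restart NetworkManager due to wireless or team interfaces", sits behind the 'service' action plugin and reuses the same wireless/team gate already seen for the consent prompt. A minimal sketch of that shape: the service name and restart semantics are inferred from the task title, and only the gate matches the conditional the log evaluates.

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager        # inferred from the task title
    state: restarted            # inferred from the task title
  when: __network_wireless_connections_defined or __network_team_connections_defined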
25675 1727204018.84852: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25675 1727204018.84983: in run() - task 028d2410-947f-41bd-b19d-000000000067 25675 1727204018.85004: variable 'ansible_search_path' from source: unknown 25675 1727204018.85011: variable 'ansible_search_path' from source: unknown 25675 1727204018.85056: calling self._execute() 25675 1727204018.85160: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.85171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.85189: variable 'omit' from source: magic vars 25675 1727204018.85565: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.85589: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.85718: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204018.85928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204018.88200: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204018.88274: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204018.88383: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204018.88386: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204018.88399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204018.88494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.88548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.88585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.88638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.88659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.88714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.88752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.88786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25675 1727204018.88982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.88986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.88988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.88990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.88992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.88994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.89005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.89201: variable 'network_connections' from source: play vars 25675 1727204018.89226: variable 'profile' from source: play vars 25675 1727204018.89302: variable 'profile' from source: play vars 25675 1727204018.89312: variable 'interface' from source: set_fact 25675 1727204018.89386: variable 'interface' from source: set_fact 25675 1727204018.89471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204018.89658: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204018.89703: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204018.89766: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204018.89780: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204018.89818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204018.89840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204018.89865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.89986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204018.89989: variable '__network_team_connections_defined' from source: role '' defaults 25675 
1727204018.90200: variable 'network_connections' from source: play vars 25675 1727204018.90213: variable 'profile' from source: play vars 25675 1727204018.90280: variable 'profile' from source: play vars 25675 1727204018.90290: variable 'interface' from source: set_fact 25675 1727204018.90359: variable 'interface' from source: set_fact 25675 1727204018.90393: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25675 1727204018.90401: when evaluation is False, skipping this task 25675 1727204018.90417: _execute() done 25675 1727204018.90419: dumping result to json 25675 1727204018.90528: done dumping result, returning 25675 1727204018.90531: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-41bd-b19d-000000000067] 25675 1727204018.90542: sending task result for task 028d2410-947f-41bd-b19d-000000000067 25675 1727204018.90614: done sending task result for task 028d2410-947f-41bd-b19d-000000000067 25675 1727204018.90617: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25675 1727204018.90668: no more pending results, returning what we have 25675 1727204018.90673: results queue empty 25675 1727204018.90674: checking for any_errors_fatal 25675 1727204018.90684: done checking for any_errors_fatal 25675 1727204018.90685: checking for max_fail_percentage 25675 1727204018.90687: done checking for max_fail_percentage 25675 1727204018.90688: checking to see if all hosts have failed and the running result is not ok 25675 1727204018.90689: done checking to see if all hosts have failed 25675 1727204018.90689: getting the remaining hosts for this loop 25675 1727204018.90691: done getting the remaining hosts for this loop 25675 1727204018.90695: getting the next task for host managed-node2 25675 1727204018.90702: done getting next task for host managed-node2 25675 1727204018.90706: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25675 1727204018.90708: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204018.90721: getting variables 25675 1727204018.90723: in VariableManager get_vars() 25675 1727204018.90762: Calling all_inventory to load vars for managed-node2 25675 1727204018.90765: Calling groups_inventory to load vars for managed-node2 25675 1727204018.90767: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204018.90884: Calling all_plugins_play to load vars for managed-node2 25675 1727204018.90893: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204018.90897: Calling groups_plugins_play to load vars for managed-node2 25675 1727204018.92525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204018.94255: done with get_vars() 25675 1727204018.94292: done getting variables 25675 1727204018.94351: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:53:38 -0400 (0:00:00.101) 0:00:38.395 ***** 25675 1727204018.94390: entering _queue_task() for managed-node2/service 25675 1727204018.94760: worker is 1 (out of 1 available) 25675 1727204018.94772: exiting _queue_task() for managed-node2/service 25675 1727204018.94788: done queuing things up, now waiting for results queue to drain 25675 1727204018.94880: waiting for pending results... 25675 1727204018.95199: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25675 1727204018.95285: in run() - task 028d2410-947f-41bd-b19d-000000000068 25675 1727204018.95294: variable 'ansible_search_path' from source: unknown 25675 1727204018.95297: variable 'ansible_search_path' from source: unknown 25675 1727204018.95299: calling self._execute() 25675 1727204018.95387: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204018.95406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204018.95513: variable 'omit' from source: magic vars 25675 1727204018.95828: variable 'ansible_distribution_major_version' from source: facts 25675 1727204018.95854: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204018.96034: variable 'network_provider' from source: set_fact 25675 1727204018.96054: variable 'network_state' from source: role '' defaults 25675 1727204018.96082: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25675 1727204018.96098: variable 'omit' from source: magic vars 25675 1727204018.96145: variable 'omit' from source: magic vars 25675 1727204018.96279: variable 'network_service_name' from source: role '' defaults 25675 1727204018.96283: variable 'network_service_name' from source: role '' defaults 25675 1727204018.96403: variable '__network_provider_setup' from source: role '' defaults 25675 1727204018.96417: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727204018.96498: variable '__network_service_name_default_nm' from source: role '' defaults 25675 1727204018.96584: variable '__network_packages_default_nm' from source: role '' defaults 
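"Enable and start NetworkManager" is the first task in this stretch whose conditional evaluates True (network_provider == "nm" or network_state != {}); the trace then resolves 'network_service_name' from the role defaults before walking the provider-setup variables. A minimal sketch of a service task with that shape, where the enabled/started values are assumptions taken from the task title.

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolved from role defaults in the trace above
    state: started                       # assumed
    enabled: true                        # assumed
  when: network_provider == "nm" or network_state != {}   # evaluated True above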
25675 1727204018.96587: variable '__network_packages_default_nm' from source: role '' defaults 25675 1727204018.96845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204018.99431: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204018.99537: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204018.99552: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204018.99596: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204018.99628: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204018.99721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.99765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.99982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204018.99984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204018.99986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204018.99991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204018.99993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204018.99995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.00028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204019.00047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204019.00314: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25675 1727204019.00450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204019.00483: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204019.00514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.00563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204019.00589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204019.00691: variable 'ansible_python' from source: facts 25675 1727204019.00718: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25675 1727204019.00816: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727204019.00911: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727204019.01038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204019.01084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204019.01107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.01148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204019.01204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204019.01226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204019.01264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204019.01298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.01348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204019.01421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204019.01530: variable 'network_connections' from 
source: play vars 25675 1727204019.01537: variable 'profile' from source: play vars 25675 1727204019.01594: variable 'profile' from source: play vars 25675 1727204019.01599: variable 'interface' from source: set_fact 25675 1727204019.01643: variable 'interface' from source: set_fact 25675 1727204019.01724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204019.02019: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204019.02023: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204019.02025: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204019.02028: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204019.02081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204019.02106: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204019.02146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.02170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204019.02218: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204019.02494: variable 'network_connections' from source: play vars 25675 1727204019.02500: variable 'profile' from source: play vars 25675 1727204019.02569: variable 'profile' from source: play vars 25675 1727204019.02680: variable 'interface' from source: set_fact 25675 1727204019.02685: variable 'interface' from source: set_fact 25675 1727204019.02689: variable '__network_packages_default_wireless' from source: role '' defaults 25675 1727204019.02735: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204019.03003: variable 'network_connections' from source: play vars 25675 1727204019.03007: variable 'profile' from source: play vars 25675 1727204019.03075: variable 'profile' from source: play vars 25675 1727204019.03083: variable 'interface' from source: set_fact 25675 1727204019.03135: variable 'interface' from source: set_fact 25675 1727204019.03152: variable '__network_packages_default_team' from source: role '' defaults 25675 1727204019.03227: variable '__network_team_connections_defined' from source: role '' defaults 25675 1727204019.03683: variable 'network_connections' from source: play vars 25675 1727204019.03686: variable 'profile' from source: play vars 25675 1727204019.03688: variable 'profile' from source: play vars 25675 1727204019.03689: variable 'interface' from source: set_fact 25675 1727204019.03745: variable 'interface' from source: set_fact 25675 1727204019.03805: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727204019.03866: variable '__network_service_name_default_initscripts' from source: role '' defaults 25675 1727204019.03881: 
variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727204019.03941: variable '__network_packages_default_initscripts' from source: role '' defaults 25675 1727204019.04229: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25675 1727204019.04587: variable 'network_connections' from source: play vars 25675 1727204019.04590: variable 'profile' from source: play vars 25675 1727204019.04627: variable 'profile' from source: play vars 25675 1727204019.04630: variable 'interface' from source: set_fact 25675 1727204019.04681: variable 'interface' from source: set_fact 25675 1727204019.04692: variable 'ansible_distribution' from source: facts 25675 1727204019.04698: variable '__network_rh_distros' from source: role '' defaults 25675 1727204019.04701: variable 'ansible_distribution_major_version' from source: facts 25675 1727204019.04708: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25675 1727204019.04848: variable 'ansible_distribution' from source: facts 25675 1727204019.04851: variable '__network_rh_distros' from source: role '' defaults 25675 1727204019.04855: variable 'ansible_distribution_major_version' from source: facts 25675 1727204019.04867: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25675 1727204019.05006: variable 'ansible_distribution' from source: facts 25675 1727204019.05016: variable '__network_rh_distros' from source: role '' defaults 25675 1727204019.05021: variable 'ansible_distribution_major_version' from source: facts 25675 1727204019.05052: variable 'network_provider' from source: set_fact 25675 1727204019.05073: variable 'omit' from source: magic vars 25675 1727204019.05097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204019.05117: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204019.05142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204019.05166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204019.05170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204019.05193: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204019.05196: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204019.05198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204019.05282: Set connection var ansible_shell_type to sh 25675 1727204019.05288: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204019.05293: Set connection var ansible_timeout to 10 25675 1727204019.05298: Set connection var ansible_pipelining to False 25675 1727204019.05307: Set connection var ansible_shell_executable to /bin/sh 25675 1727204019.05310: Set connection var ansible_connection to ssh 25675 1727204019.05346: variable 'ansible_shell_executable' from source: unknown 25675 1727204019.05350: variable 'ansible_connection' from source: unknown 25675 1727204019.05353: variable 'ansible_module_compression' from source: unknown 25675 1727204019.05356: variable 'ansible_shell_type' from source: unknown 25675 1727204019.05358: variable 'ansible_shell_executable' from 
source: unknown 25675 1727204019.05360: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204019.05366: variable 'ansible_pipelining' from source: unknown 25675 1727204019.05368: variable 'ansible_timeout' from source: unknown 25675 1727204019.05370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204019.05472: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204019.05479: variable 'omit' from source: magic vars 25675 1727204019.05483: starting attempt loop 25675 1727204019.05485: running the handler 25675 1727204019.05605: variable 'ansible_facts' from source: unknown 25675 1727204019.06336: _low_level_execute_command(): starting 25675 1727204019.06339: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204019.06990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204019.06993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204019.06996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204019.06998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204019.07000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204019.07087: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204019.07104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204019.07208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204019.09234: stdout chunk (state=3): >>>/root <<< 25675 1727204019.09237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204019.09240: stdout chunk (state=3): >>><<< 25675 1727204019.09242: stderr chunk (state=3): >>><<< 25675 1727204019.09245: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204019.09247: _low_level_execute_command(): starting 25675 1727204019.09250: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286 `" && echo ansible-tmp-1727204019.0922759-28830-247211995867286="` echo /root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286 `" ) && sleep 0' 25675 1727204019.10451: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204019.10456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204019.10501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204019.10699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204019.10703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204019.10806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204019.12890: stdout chunk (state=3): >>>ansible-tmp-1727204019.0922759-28830-247211995867286=/root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286 <<< 25675 1727204019.13008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204019.13043: stderr chunk (state=3): >>><<< 25675 1727204019.13051: stdout chunk (state=3): >>><<< 25675 1727204019.13382: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204019.0922759-28830-247211995867286=/root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204019.13385: variable 'ansible_module_compression' from source: unknown 25675 1727204019.13388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 25675 1727204019.13390: variable 'ansible_facts' from source: unknown 25675 1727204019.13752: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/AnsiballZ_systemd.py 25675 1727204019.14204: Sending initial data 25675 1727204019.14214: Sent initial data (156 bytes) 25675 1727204019.15404: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204019.15684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204019.15702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204019.15853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204019.17560: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204019.17630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204019.17703: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpe475cbvo /root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/AnsiballZ_systemd.py <<< 25675 1727204019.17791: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/AnsiballZ_systemd.py" <<< 25675 1727204019.17827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpe475cbvo" to remote "/root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/AnsiballZ_systemd.py" <<< 25675 1727204019.17840: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/AnsiballZ_systemd.py" <<< 25675 1727204019.20714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204019.20731: stderr chunk (state=3): >>><<< 25675 1727204019.20741: stdout chunk (state=3): >>><<< 25675 1727204019.20779: done transferring module to remote 25675 1727204019.20812: _low_level_execute_command(): starting 25675 1727204019.20822: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/ /root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/AnsiballZ_systemd.py && sleep 0' 25675 1727204019.21418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204019.21432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204019.21447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204019.21465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204019.21489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204019.21503: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204019.21516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204019.21534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204019.21545: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204019.21556: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727204019.21568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204019.21596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204019.21674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204019.21701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204019.21814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204019.23719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204019.23764: 
stdout chunk (state=3): >>><<< 25675 1727204019.23810: stderr chunk (state=3): >>><<< 25675 1727204019.23851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204019.23984: _low_level_execute_command(): starting 25675 1727204019.23988: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/AnsiballZ_systemd.py && sleep 0' 25675 1727204019.24826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204019.24840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204019.24855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204019.24871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204019.24893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204019.24909: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204019.24994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204019.25013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204019.25211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204019.25234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204019.25337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204019.54317: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", 
"RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4472832", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303178240", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "778130000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": 
"infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 25675 1727204019.54349: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25675 1727204019.56251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204019.56265: stdout chunk (state=3): >>><<< 25675 1727204019.56289: stderr chunk (state=3): >>><<< 25675 1727204019.56484: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4472832", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303178240", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "778130000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204019.56524: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204019.56550: _low_level_execute_command(): starting 25675 1727204019.56559: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204019.0922759-28830-247211995867286/ > /dev/null 2>&1 && sleep 0' 25675 1727204019.57119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204019.57132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204019.57149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204019.57193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204019.57210: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204019.57287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204019.59179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204019.59182: stdout chunk (state=3): >>><<< 25675 1727204019.59185: stderr chunk (state=3): >>><<< 25675 1727204019.59383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204019.59386: handler run complete 25675 1727204019.59389: attempt loop complete, returning result 25675 1727204019.59391: _execute() done 25675 1727204019.59392: dumping result to json 25675 1727204019.59394: done dumping result, returning 25675 1727204019.59396: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-41bd-b19d-000000000068] 25675 1727204019.59398: sending task result for task 028d2410-947f-41bd-b19d-000000000068 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204019.59649: no more pending results, returning what we have 25675 1727204019.59653: results queue empty 25675 1727204019.59653: checking for any_errors_fatal 25675 1727204019.59660: done checking for any_errors_fatal 25675 1727204019.59660: checking for max_fail_percentage 25675 1727204019.59662: done checking for max_fail_percentage 25675 1727204019.59663: checking to see if all hosts have failed and the running result is not ok 25675 1727204019.59664: done checking to see if all hosts have failed 25675 1727204019.59664: getting the remaining hosts for this loop 25675 1727204019.59665: done getting the remaining hosts for this loop 25675 1727204019.59669: getting the next task for host managed-node2 25675 1727204019.59674: done getting next task for host managed-node2 25675 1727204019.59679: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25675 1727204019.59681: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204019.59690: getting variables 25675 1727204019.59692: in VariableManager get_vars() 25675 1727204019.59765: Calling all_inventory to load vars for managed-node2 25675 1727204019.59768: Calling groups_inventory to load vars for managed-node2 25675 1727204019.59770: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204019.59778: done sending task result for task 028d2410-947f-41bd-b19d-000000000068 25675 1727204019.59781: WORKER PROCESS EXITING 25675 1727204019.59790: Calling all_plugins_play to load vars for managed-node2 25675 1727204019.59803: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204019.59807: Calling groups_plugins_play to load vars for managed-node2 25675 1727204019.61229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204019.62952: done with get_vars() 25675 1727204019.62993: done getting variables 25675 1727204019.63059: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:53:39 -0400 (0:00:00.687) 0:00:39.082 ***** 25675 1727204019.63107: entering _queue_task() for managed-node2/service 25675 1727204019.63526: worker is 1 (out of 1 available) 25675 1727204019.63539: exiting _queue_task() for managed-node2/service 25675 1727204019.63552: done queuing things up, now waiting for results queue to drain 25675 1727204019.63553: waiting for pending results... 
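[Annotation] For reference, the censored "ok" result above comes from the role task "Enable and start NetworkManager", and the task queued next ("Enable and start wpa_supplicant", tasks/main.yml:133) is evaluated immediately below. The following is a minimal sketch of what those two tasks look like, reconstructed only from the module arguments, the no_log behavior, and the conditionals visible in this trace; service names and when-expressions are shown as they resolved at runtime, so this is an approximation, not the role's actual tasks/main.yml:

# Hedged sketch, inferred from the log only -- not the role source.
- name: Enable and start NetworkManager
  ansible.builtin.service:          # the 'service' action plugin dispatched this
    name: NetworkManager            #   to ansible.legacy.systemd in the trace above
    state: started
    enabled: true
  no_log: true                      # why the result is shown as "censored"

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant            # assumed; the real name comes from role defaults
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'     # evaluated True above
    - network_provider == "nm"                      # evaluated True above
    - __network_wpa_supplicant_required | bool      # evaluates False below -> task skipped

Because no_log: true is set, Ansible replaces the registered systemd status payload with the "censored" placeholder seen in the ok: result; the wpa_supplicant task is then skipped because __network_wpa_supplicant_required resolves to False, as the next entries show.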
25675 1727204019.63846: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25675 1727204019.63979: in run() - task 028d2410-947f-41bd-b19d-000000000069 25675 1727204019.64013: variable 'ansible_search_path' from source: unknown 25675 1727204019.64112: variable 'ansible_search_path' from source: unknown 25675 1727204019.64116: calling self._execute() 25675 1727204019.64178: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204019.64191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204019.64204: variable 'omit' from source: magic vars 25675 1727204019.64608: variable 'ansible_distribution_major_version' from source: facts 25675 1727204019.64624: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204019.64752: variable 'network_provider' from source: set_fact 25675 1727204019.64773: Evaluated conditional (network_provider == "nm"): True 25675 1727204019.64870: variable '__network_wpa_supplicant_required' from source: role '' defaults 25675 1727204019.64980: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25675 1727204019.65141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204019.67364: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204019.67450: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204019.67505: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204019.67545: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204019.67582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204019.67688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204019.67735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204019.67768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.67824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204019.67844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204019.67895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204019.67987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25675 1727204019.67990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.68000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204019.68020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204019.68063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204019.68102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204019.68130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.68172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204019.68200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204019.68357: variable 'network_connections' from source: play vars 25675 1727204019.68378: variable 'profile' from source: play vars 25675 1727204019.68581: variable 'profile' from source: play vars 25675 1727204019.68584: variable 'interface' from source: set_fact 25675 1727204019.68586: variable 'interface' from source: set_fact 25675 1727204019.68626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25675 1727204019.68814: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25675 1727204019.68854: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25675 1727204019.68888: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25675 1727204019.68929: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25675 1727204019.68973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25675 1727204019.69002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25675 1727204019.69038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.69068: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25675 1727204019.69141: variable '__network_wireless_connections_defined' from source: role '' defaults 25675 1727204019.69401: variable 'network_connections' from source: play vars 25675 1727204019.69411: variable 'profile' from source: play vars 25675 1727204019.69575: variable 'profile' from source: play vars 25675 1727204019.69581: variable 'interface' from source: set_fact 25675 1727204019.69583: variable 'interface' from source: set_fact 25675 1727204019.69598: Evaluated conditional (__network_wpa_supplicant_required): False 25675 1727204019.69606: when evaluation is False, skipping this task 25675 1727204019.69613: _execute() done 25675 1727204019.69629: dumping result to json 25675 1727204019.69637: done dumping result, returning 25675 1727204019.69648: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-41bd-b19d-000000000069] 25675 1727204019.69682: sending task result for task 028d2410-947f-41bd-b19d-000000000069 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25675 1727204019.69850: no more pending results, returning what we have 25675 1727204019.69853: results queue empty 25675 1727204019.69854: checking for any_errors_fatal 25675 1727204019.69873: done checking for any_errors_fatal 25675 1727204019.69874: checking for max_fail_percentage 25675 1727204019.69879: done checking for max_fail_percentage 25675 1727204019.69880: checking to see if all hosts have failed and the running result is not ok 25675 1727204019.69881: done checking to see if all hosts have failed 25675 1727204019.69881: getting the remaining hosts for this loop 25675 1727204019.69882: done getting the remaining hosts for this loop 25675 1727204019.69887: getting the next task for host managed-node2 25675 1727204019.69894: done getting next task for host managed-node2 25675 1727204019.69898: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25675 1727204019.69900: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204019.69914: getting variables 25675 1727204019.69915: in VariableManager get_vars() 25675 1727204019.69956: Calling all_inventory to load vars for managed-node2 25675 1727204019.69959: Calling groups_inventory to load vars for managed-node2 25675 1727204019.69961: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204019.69973: Calling all_plugins_play to load vars for managed-node2 25675 1727204019.70195: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204019.70202: Calling groups_plugins_play to load vars for managed-node2 25675 1727204019.70814: done sending task result for task 028d2410-947f-41bd-b19d-000000000069 25675 1727204019.70818: WORKER PROCESS EXITING 25675 1727204019.71804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204019.73670: done with get_vars() 25675 1727204019.73695: done getting variables 25675 1727204019.73756: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:53:39 -0400 (0:00:00.106) 0:00:39.189 ***** 25675 1727204019.73796: entering _queue_task() for managed-node2/service 25675 1727204019.74169: worker is 1 (out of 1 available) 25675 1727204019.74285: exiting _queue_task() for managed-node2/service 25675 1727204019.74298: done queuing things up, now waiting for results queue to drain 25675 1727204019.74300: waiting for pending results... 
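The wpa_supplicant task above is skipped after three conditional evaluations: ansible_distribution_major_version != '6' (True), network_provider == "nm" (True), and __network_wpa_supplicant_required (False). Below is a small sketch of that evaluation using plain Jinja2, which Ansible's templating builds on; the variable values are assumptions chosen to reproduce this run, not values read from the real inventory, and this is not Ansible's own templar code.

from jinja2 import Environment

env = Environment()
task_vars = {
    "ansible_distribution_major_version": "9",    # assumed; anything but '6' passes here
    "network_provider": "nm",
    "__network_wpa_supplicant_required": False,   # role default: no 802.1x connection defined
}

for cond in (
    "ansible_distribution_major_version != '6'",
    'network_provider == "nm"',
    "__network_wpa_supplicant_required",
):
    result = env.compile_expression(cond)(**task_vars)
    print(f"Evaluated conditional ({cond}): {bool(result)}")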
25675 1727204019.74501: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 25675 1727204019.74613: in run() - task 028d2410-947f-41bd-b19d-00000000006a 25675 1727204019.74639: variable 'ansible_search_path' from source: unknown 25675 1727204019.74655: variable 'ansible_search_path' from source: unknown 25675 1727204019.74702: calling self._execute() 25675 1727204019.74816: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204019.74827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204019.74840: variable 'omit' from source: magic vars 25675 1727204019.75246: variable 'ansible_distribution_major_version' from source: facts 25675 1727204019.75263: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204019.75385: variable 'network_provider' from source: set_fact 25675 1727204019.75396: Evaluated conditional (network_provider == "initscripts"): False 25675 1727204019.75404: when evaluation is False, skipping this task 25675 1727204019.75418: _execute() done 25675 1727204019.75425: dumping result to json 25675 1727204019.75432: done dumping result, returning 25675 1727204019.75441: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-41bd-b19d-00000000006a] 25675 1727204019.75450: sending task result for task 028d2410-947f-41bd-b19d-00000000006a 25675 1727204019.75586: done sending task result for task 028d2410-947f-41bd-b19d-00000000006a 25675 1727204019.75589: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25675 1727204019.75671: no more pending results, returning what we have 25675 1727204019.75677: results queue empty 25675 1727204019.75678: checking for any_errors_fatal 25675 1727204019.75688: done checking for any_errors_fatal 25675 1727204019.75688: checking for max_fail_percentage 25675 1727204019.75690: done checking for max_fail_percentage 25675 1727204019.75691: checking to see if all hosts have failed and the running result is not ok 25675 1727204019.75692: done checking to see if all hosts have failed 25675 1727204019.75693: getting the remaining hosts for this loop 25675 1727204019.75694: done getting the remaining hosts for this loop 25675 1727204019.75699: getting the next task for host managed-node2 25675 1727204019.75707: done getting next task for host managed-node2 25675 1727204019.75711: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25675 1727204019.75713: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204019.75729: getting variables 25675 1727204019.75733: in VariableManager get_vars() 25675 1727204019.75768: Calling all_inventory to load vars for managed-node2 25675 1727204019.75771: Calling groups_inventory to load vars for managed-node2 25675 1727204019.75773: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204019.75786: Calling all_plugins_play to load vars for managed-node2 25675 1727204019.75790: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204019.75792: Calling groups_plugins_play to load vars for managed-node2 25675 1727204019.77354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204019.78911: done with get_vars() 25675 1727204019.78934: done getting variables 25675 1727204019.78999: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:53:39 -0400 (0:00:00.052) 0:00:39.241 ***** 25675 1727204019.79031: entering _queue_task() for managed-node2/copy 25675 1727204019.79370: worker is 1 (out of 1 available) 25675 1727204019.79481: exiting _queue_task() for managed-node2/copy 25675 1727204019.79493: done queuing things up, now waiting for results queue to drain 25675 1727204019.79494: waiting for pending results... 25675 1727204019.79733: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25675 1727204019.79829: in run() - task 028d2410-947f-41bd-b19d-00000000006b 25675 1727204019.79837: variable 'ansible_search_path' from source: unknown 25675 1727204019.79841: variable 'ansible_search_path' from source: unknown 25675 1727204019.79873: calling self._execute() 25675 1727204019.79985: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204019.79988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204019.79992: variable 'omit' from source: magic vars 25675 1727204019.80267: variable 'ansible_distribution_major_version' from source: facts 25675 1727204019.80277: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204019.80356: variable 'network_provider' from source: set_fact 25675 1727204019.80360: Evaluated conditional (network_provider == "initscripts"): False 25675 1727204019.80362: when evaluation is False, skipping this task 25675 1727204019.80365: _execute() done 25675 1727204019.80373: dumping result to json 25675 1727204019.80381: done dumping result, returning 25675 1727204019.80388: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-41bd-b19d-00000000006b] 25675 1727204019.80391: sending task result for task 028d2410-947f-41bd-b19d-00000000006b 25675 1727204019.80472: done sending task result for task 028d2410-947f-41bd-b19d-00000000006b 25675 1727204019.80475: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 25675 1727204019.80533: no more pending results, returning what we have 25675 1727204019.80537: results queue empty 25675 1727204019.80537: checking for any_errors_fatal 25675 1727204019.80543: done checking for any_errors_fatal 25675 1727204019.80544: checking for max_fail_percentage 25675 1727204019.80546: done checking for max_fail_percentage 25675 1727204019.80547: checking to see if all hosts have failed and the running result is not ok 25675 1727204019.80547: done checking to see if all hosts have failed 25675 1727204019.80548: getting the remaining hosts for this loop 25675 1727204019.80549: done getting the remaining hosts for this loop 25675 1727204019.80553: getting the next task for host managed-node2 25675 1727204019.80559: done getting next task for host managed-node2 25675 1727204019.80562: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25675 1727204019.80564: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204019.80580: getting variables 25675 1727204019.80582: in VariableManager get_vars() 25675 1727204019.80617: Calling all_inventory to load vars for managed-node2 25675 1727204019.80620: Calling groups_inventory to load vars for managed-node2 25675 1727204019.80621: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204019.80630: Calling all_plugins_play to load vars for managed-node2 25675 1727204019.80632: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204019.80634: Calling groups_plugins_play to load vars for managed-node2 25675 1727204019.81530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204019.82756: done with get_vars() 25675 1727204019.82780: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:53:39 -0400 (0:00:00.038) 0:00:39.279 ***** 25675 1727204019.82855: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 25675 1727204019.83156: worker is 1 (out of 1 available) 25675 1727204019.83167: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 25675 1727204019.83183: done queuing things up, now waiting for results queue to drain 25675 1727204019.83185: waiting for pending results... 
25675 1727204019.83754: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25675 1727204019.83958: in run() - task 028d2410-947f-41bd-b19d-00000000006c 25675 1727204019.83962: variable 'ansible_search_path' from source: unknown 25675 1727204019.83964: variable 'ansible_search_path' from source: unknown 25675 1727204019.83972: calling self._execute() 25675 1727204019.84073: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204019.84086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204019.84177: variable 'omit' from source: magic vars 25675 1727204019.84543: variable 'ansible_distribution_major_version' from source: facts 25675 1727204019.84554: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204019.84569: variable 'omit' from source: magic vars 25675 1727204019.84597: variable 'omit' from source: magic vars 25675 1727204019.84720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25675 1727204019.86408: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25675 1727204019.86482: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25675 1727204019.86534: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25675 1727204019.86574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25675 1727204019.86651: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25675 1727204019.86992: variable 'network_provider' from source: set_fact 25675 1727204019.87281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25675 1727204019.87285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25675 1727204019.87296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25675 1727204019.87336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25675 1727204019.87350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25675 1727204019.87426: variable 'omit' from source: magic vars 25675 1727204019.87538: variable 'omit' from source: magic vars 25675 1727204019.87640: variable 'network_connections' from source: play vars 25675 1727204019.87651: variable 'profile' from source: play vars 25675 1727204019.87718: variable 'profile' from source: play vars 25675 1727204019.87722: variable 'interface' from source: set_fact 25675 1727204019.87855: variable 'interface' from source: set_fact 25675 1727204019.87917: variable 'omit' from source: magic vars 25675 1727204019.87925: 
variable '__lsr_ansible_managed' from source: task vars 25675 1727204019.87985: variable '__lsr_ansible_managed' from source: task vars 25675 1727204019.88248: Loaded config def from plugin (lookup/template) 25675 1727204019.88252: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25675 1727204019.88280: File lookup term: get_ansible_managed.j2 25675 1727204019.88292: variable 'ansible_search_path' from source: unknown 25675 1727204019.88295: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25675 1727204019.88303: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25675 1727204019.88321: variable 'ansible_search_path' from source: unknown 25675 1727204019.99512: variable 'ansible_managed' from source: unknown 25675 1727204019.99654: variable 'omit' from source: magic vars 25675 1727204019.99762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204019.99766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204019.99768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204019.99771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204019.99773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204019.99777: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204019.99780: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204019.99783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204019.99830: Set connection var ansible_shell_type to sh 25675 1727204019.99835: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204019.99841: Set connection var ansible_timeout to 10 25675 1727204019.99847: Set connection var ansible_pipelining to False 25675 1727204019.99852: Set connection var ansible_shell_executable to /bin/sh 25675 1727204019.99854: Set connection var ansible_connection to ssh 25675 1727204019.99886: variable 'ansible_shell_executable' from source: unknown 25675 1727204019.99890: variable 'ansible_connection' from source: unknown 25675 1727204019.99893: 
variable 'ansible_module_compression' from source: unknown 25675 1727204019.99895: variable 'ansible_shell_type' from source: unknown 25675 1727204019.99898: variable 'ansible_shell_executable' from source: unknown 25675 1727204019.99900: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204019.99902: variable 'ansible_pipelining' from source: unknown 25675 1727204019.99904: variable 'ansible_timeout' from source: unknown 25675 1727204019.99907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.00022: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204020.00033: variable 'omit' from source: magic vars 25675 1727204020.00036: starting attempt loop 25675 1727204020.00039: running the handler 25675 1727204020.00085: _low_level_execute_command(): starting 25675 1727204020.00091: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204020.00693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204020.00704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.00715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.00733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204020.00792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204020.00795: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204020.00798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.00800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204020.00802: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204020.00804: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25675 1727204020.00806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.00808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.00816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204020.00823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204020.00831: stderr chunk (state=3): >>>debug2: match found <<< 25675 1727204020.00841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.00909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.00921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.00949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.01055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.03071: stdout chunk (state=3): >>>/root <<< 25675 1727204020.03075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 
1727204020.03079: stdout chunk (state=3): >>><<< 25675 1727204020.03082: stderr chunk (state=3): >>><<< 25675 1727204020.03084: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204020.03087: _low_level_execute_command(): starting 25675 1727204020.03090: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933 `" && echo ansible-tmp-1727204020.0299237-28877-204995226496933="` echo /root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933 `" ) && sleep 0' 25675 1727204020.03673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.03684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204020.03686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727204020.03689: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.03691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.03693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.03733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.03802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.05716: stdout chunk (state=3): >>>ansible-tmp-1727204020.0299237-28877-204995226496933=/root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933 <<< 
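The mkdir exchange above is the standard per-task staging step: a /bin/sh -c '( umask 77 && mkdir -p ... && mkdir ansible-tmp-<timestamp>-<pid>-<random> ... )' snippet is sent over the multiplexed SSH connection and the new directory name is echoed back on stdout. A rough reconstruction of how such a command string can be assembled; the suffix format and the use of $HOME here are simplifications, and this sketch only prints the command rather than executing it over SSH.

import os
import random
import time

# Rough reconstruction of the remote staging command seen above.
tmp = f"/root/.ansible/tmp/ansible-tmp-{time.time()}-{os.getpid()}-{random.randrange(10**15)}"
inner = (
    f'( umask 77 && mkdir -p "$HOME/.ansible/tmp" && mkdir "{tmp}" '
    f"&& echo {tmp} ) && sleep 0"
)
remote_cmd = f"/bin/sh -c '{inner}'"
print(remote_cmd)
# When Ansible runs this over its existing ControlMaster SSH connection, the
# echoed directory name on stdout becomes the remote tmpdir for AnsiballZ_*.py.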
25675 1727204020.05879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.05884: stdout chunk (state=3): >>><<< 25675 1727204020.05887: stderr chunk (state=3): >>><<< 25675 1727204020.06040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204020.0299237-28877-204995226496933=/root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204020.06044: variable 'ansible_module_compression' from source: unknown 25675 1727204020.06046: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 25675 1727204020.06073: variable 'ansible_facts' from source: unknown 25675 1727204020.06233: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/AnsiballZ_network_connections.py 25675 1727204020.06405: Sending initial data 25675 1727204020.06414: Sent initial data (168 bytes) 25675 1727204020.07243: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204020.07499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.07502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.07505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204020.07507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.07510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.07553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.07565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.07573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.07682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.09271: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 25675 1727204020.09279: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204020.09354: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204020.09432: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpegssclgk /root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/AnsiballZ_network_connections.py <<< 25675 1727204020.09435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/AnsiballZ_network_connections.py" <<< 25675 1727204020.09498: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpegssclgk" to remote "/root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/AnsiballZ_network_connections.py" <<< 25675 1727204020.10899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.10904: stdout chunk (state=3): >>><<< 25675 1727204020.10906: stderr chunk (state=3): >>><<< 25675 1727204020.10909: done transferring module to remote 25675 1727204020.10911: _low_level_execute_command(): starting 25675 1727204020.10913: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/ /root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/AnsiballZ_network_connections.py && sleep 0' 25675 1727204020.11670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204020.11771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.11790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.11924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.12147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.12245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.14092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.14095: stdout chunk (state=3): >>><<< 25675 1727204020.14097: stderr chunk (state=3): >>><<< 25675 1727204020.14117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204020.14183: _low_level_execute_command(): starting 25675 1727204020.14186: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/AnsiballZ_network_connections.py && sleep 0' 25675 1727204020.14830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204020.14850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.14865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.14903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204020.14925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204020.14957: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 25675 1727204020.15046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.15083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.15202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.41778: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 25675 1727204020.41786: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lcuwmqi0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lcuwmqi0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/2337de5b-b8f2-42c8-892f-a64413dea3ee: error=unknown <<< 25675 1727204020.41944: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 25675 1727204020.43971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204020.43977: stdout chunk (state=3): >>><<< 25675 1727204020.43980: stderr chunk (state=3): >>><<< 25675 1727204020.43982: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lcuwmqi0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lcuwmqi0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/2337de5b-b8f2-42c8-892f-a64413dea3ee: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727204020.44019: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204020.44083: _low_level_execute_command(): starting 25675 1727204020.44087: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204020.0299237-28877-204995226496933/ > /dev/null 2>&1 && sleep 0' 25675 1727204020.45300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204020.45384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.45387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.45390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204020.45392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204020.45394: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204020.45396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.45609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.45653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.45657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.45732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.47804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.47807: stdout chunk (state=3): >>><<< 25675 1727204020.47810: stderr chunk (state=3): >>><<< 25675 1727204020.47816: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204020.47818: handler run complete 25675 1727204020.47820: attempt loop complete, returning result 25675 1727204020.47822: _execute() done 25675 1727204020.47824: dumping result to json 25675 1727204020.47826: done dumping result, returning 25675 1727204020.47828: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-41bd-b19d-00000000006c] 25675 1727204020.47829: sending task result for task 028d2410-947f-41bd-b19d-00000000006c changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 25675 1727204020.47997: no more pending results, returning what we have 25675 1727204020.48000: results queue empty 25675 1727204020.48001: checking for any_errors_fatal 25675 1727204020.48008: done checking for any_errors_fatal 25675 1727204020.48008: checking for max_fail_percentage 25675 1727204020.48010: done checking for max_fail_percentage 25675 1727204020.48011: checking to see if all hosts have failed and the running result is not ok 25675 1727204020.48011: done checking to see if all hosts have failed 25675 1727204020.48012: getting the remaining hosts for this loop 25675 1727204020.48013: done getting the remaining hosts for this loop 25675 1727204020.48024: getting the next task for host managed-node2 25675 1727204020.48029: done getting next task for host managed-node2 25675 1727204020.48032: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 25675 1727204020.48034: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204020.48040: done sending task result for task 028d2410-947f-41bd-b19d-00000000006c 25675 1727204020.48043: WORKER PROCESS EXITING 25675 1727204020.48049: getting variables 25675 1727204020.48050: in VariableManager get_vars() 25675 1727204020.48086: Calling all_inventory to load vars for managed-node2 25675 1727204020.48095: Calling groups_inventory to load vars for managed-node2 25675 1727204020.48099: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204020.48109: Calling all_plugins_play to load vars for managed-node2 25675 1727204020.48112: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204020.48115: Calling groups_plugins_play to load vars for managed-node2 25675 1727204020.49484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204020.50353: done with get_vars() 25675 1727204020.50370: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:53:40 -0400 (0:00:00.675) 0:00:39.955 ***** 25675 1727204020.50432: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 25675 1727204020.50679: worker is 1 (out of 1 available) 25675 1727204020.50692: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 25675 1727204020.50705: done queuing things up, now waiting for results queue to drain 25675 1727204020.50706: waiting for pending results... 25675 1727204020.50873: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 25675 1727204020.50957: in run() - task 028d2410-947f-41bd-b19d-00000000006d 25675 1727204020.51043: variable 'ansible_search_path' from source: unknown 25675 1727204020.51047: variable 'ansible_search_path' from source: unknown 25675 1727204020.51050: calling self._execute() 25675 1727204020.51191: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.51195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.51199: variable 'omit' from source: magic vars 25675 1727204020.51530: variable 'ansible_distribution_major_version' from source: facts 25675 1727204020.51547: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204020.51678: variable 'network_state' from source: role '' defaults 25675 1727204020.51700: Evaluated conditional (network_state != {}): False 25675 1727204020.51732: when evaluation is False, skipping this task 25675 1727204020.51735: _execute() done 25675 1727204020.51738: dumping result to json 25675 1727204020.51740: done dumping result, returning 25675 1727204020.51783: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-41bd-b19d-00000000006d] 25675 1727204020.51786: sending task result for task 028d2410-947f-41bd-b19d-00000000006d 25675 1727204020.51919: done sending task result for task 028d2410-947f-41bd-b19d-00000000006d 25675 1727204020.51923: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25675 1727204020.51983: no more pending results, returning what we have 25675 1727204020.51988: results queue empty 25675 1727204020.51988: checking for 
any_errors_fatal 25675 1727204020.51997: done checking for any_errors_fatal 25675 1727204020.51998: checking for max_fail_percentage 25675 1727204020.52000: done checking for max_fail_percentage 25675 1727204020.52001: checking to see if all hosts have failed and the running result is not ok 25675 1727204020.52002: done checking to see if all hosts have failed 25675 1727204020.52003: getting the remaining hosts for this loop 25675 1727204020.52004: done getting the remaining hosts for this loop 25675 1727204020.52007: getting the next task for host managed-node2 25675 1727204020.52013: done getting next task for host managed-node2 25675 1727204020.52017: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25675 1727204020.52020: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204020.52034: getting variables 25675 1727204020.52035: in VariableManager get_vars() 25675 1727204020.52072: Calling all_inventory to load vars for managed-node2 25675 1727204020.52076: Calling groups_inventory to load vars for managed-node2 25675 1727204020.52079: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204020.52089: Calling all_plugins_play to load vars for managed-node2 25675 1727204020.52092: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204020.52096: Calling groups_plugins_play to load vars for managed-node2 25675 1727204020.56220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204020.57092: done with get_vars() 25675 1727204020.57108: done getting variables 25675 1727204020.57142: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:53:40 -0400 (0:00:00.067) 0:00:40.022 ***** 25675 1727204020.57160: entering _queue_task() for managed-node2/debug 25675 1727204020.57432: worker is 1 (out of 1 available) 25675 1727204020.57446: exiting _queue_task() for managed-node2/debug 25675 1727204020.57459: done queuing things up, now waiting for results queue to drain 25675 1727204020.57461: waiting for pending results... 
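The pair of "Evaluated conditional" entries just above explains the skip: ansible_distribution_major_version != '6' is true for this host, but network_state is still the role's empty-dict default, so network_state != {} is false and the "Configure networking state" task never runs. A minimal sketch of how such a bare conditional string can be booleanized, assuming only the jinja2 package (an Ansible dependency); the evaluate_conditional helper is hypothetical and not Ansible's actual code path:

    from jinja2 import Environment

    _env = Environment()

    def evaluate_conditional(expr, variables):
        # Wrap the bare expression in an if/else template and render it with the
        # host's variables, mimicking how a `when:` string ends up as True/False.
        template = _env.from_string("{% if " + expr + " %}True{% else %}False{% endif %}")
        return template.render(**variables) == "True"

    # "40" is only an example value; the real fact is not shown in this excerpt.
    print(evaluate_conditional("ansible_distribution_major_version != '6'",
                               {"ansible_distribution_major_version": "40"}))  # True
    print(evaluate_conditional("network_state != {}", {"network_state": {}}))  # False -> task skipped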
25675 1727204020.57644: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25675 1727204020.57728: in run() - task 028d2410-947f-41bd-b19d-00000000006e 25675 1727204020.57741: variable 'ansible_search_path' from source: unknown 25675 1727204020.57746: variable 'ansible_search_path' from source: unknown 25675 1727204020.57772: calling self._execute() 25675 1727204020.57848: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.57854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.57862: variable 'omit' from source: magic vars 25675 1727204020.58153: variable 'ansible_distribution_major_version' from source: facts 25675 1727204020.58163: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204020.58169: variable 'omit' from source: magic vars 25675 1727204020.58199: variable 'omit' from source: magic vars 25675 1727204020.58224: variable 'omit' from source: magic vars 25675 1727204020.58259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204020.58290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204020.58307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204020.58321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204020.58330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204020.58353: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204020.58358: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.58361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.58430: Set connection var ansible_shell_type to sh 25675 1727204020.58434: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204020.58440: Set connection var ansible_timeout to 10 25675 1727204020.58447: Set connection var ansible_pipelining to False 25675 1727204020.58450: Set connection var ansible_shell_executable to /bin/sh 25675 1727204020.58452: Set connection var ansible_connection to ssh 25675 1727204020.58479: variable 'ansible_shell_executable' from source: unknown 25675 1727204020.58483: variable 'ansible_connection' from source: unknown 25675 1727204020.58485: variable 'ansible_module_compression' from source: unknown 25675 1727204020.58488: variable 'ansible_shell_type' from source: unknown 25675 1727204020.58490: variable 'ansible_shell_executable' from source: unknown 25675 1727204020.58492: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.58494: variable 'ansible_pipelining' from source: unknown 25675 1727204020.58496: variable 'ansible_timeout' from source: unknown 25675 1727204020.58498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.58600: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 
1727204020.58609: variable 'omit' from source: magic vars 25675 1727204020.58614: starting attempt loop 25675 1727204020.58617: running the handler 25675 1727204020.58715: variable '__network_connections_result' from source: set_fact 25675 1727204020.58752: handler run complete 25675 1727204020.58764: attempt loop complete, returning result 25675 1727204020.58768: _execute() done 25675 1727204020.58772: dumping result to json 25675 1727204020.58775: done dumping result, returning 25675 1727204020.58785: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-41bd-b19d-00000000006e] 25675 1727204020.58789: sending task result for task 028d2410-947f-41bd-b19d-00000000006e 25675 1727204020.58872: done sending task result for task 028d2410-947f-41bd-b19d-00000000006e 25675 1727204020.58875: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 25675 1727204020.58939: no more pending results, returning what we have 25675 1727204020.58942: results queue empty 25675 1727204020.58942: checking for any_errors_fatal 25675 1727204020.58949: done checking for any_errors_fatal 25675 1727204020.58950: checking for max_fail_percentage 25675 1727204020.58951: done checking for max_fail_percentage 25675 1727204020.58952: checking to see if all hosts have failed and the running result is not ok 25675 1727204020.58953: done checking to see if all hosts have failed 25675 1727204020.58953: getting the remaining hosts for this loop 25675 1727204020.58955: done getting the remaining hosts for this loop 25675 1727204020.58958: getting the next task for host managed-node2 25675 1727204020.58963: done getting next task for host managed-node2 25675 1727204020.58967: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25675 1727204020.58969: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204020.58979: getting variables 25675 1727204020.58981: in VariableManager get_vars() 25675 1727204020.59013: Calling all_inventory to load vars for managed-node2 25675 1727204020.59015: Calling groups_inventory to load vars for managed-node2 25675 1727204020.59017: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204020.59026: Calling all_plugins_play to load vars for managed-node2 25675 1727204020.59029: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204020.59031: Calling groups_plugins_play to load vars for managed-node2 25675 1727204020.59812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204020.60693: done with get_vars() 25675 1727204020.60710: done getting variables 25675 1727204020.60748: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:53:40 -0400 (0:00:00.036) 0:00:40.058 ***** 25675 1727204020.60768: entering _queue_task() for managed-node2/debug 25675 1727204020.60993: worker is 1 (out of 1 available) 25675 1727204020.61008: exiting _queue_task() for managed-node2/debug 25675 1727204020.61020: done queuing things up, now waiting for results queue to drain 25675 1727204020.61021: waiting for pending results... 25675 1727204020.61189: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25675 1727204020.61260: in run() - task 028d2410-947f-41bd-b19d-00000000006f 25675 1727204020.61272: variable 'ansible_search_path' from source: unknown 25675 1727204020.61277: variable 'ansible_search_path' from source: unknown 25675 1727204020.61307: calling self._execute() 25675 1727204020.61379: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.61389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.61397: variable 'omit' from source: magic vars 25675 1727204020.61665: variable 'ansible_distribution_major_version' from source: facts 25675 1727204020.61674: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204020.61685: variable 'omit' from source: magic vars 25675 1727204020.61715: variable 'omit' from source: magic vars 25675 1727204020.61739: variable 'omit' from source: magic vars 25675 1727204020.61769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204020.61801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204020.61818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204020.61831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204020.61841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204020.61867: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204020.61871: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.61873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.61946: Set connection var ansible_shell_type to sh 25675 1727204020.61949: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204020.61955: Set connection var ansible_timeout to 10 25675 1727204020.61960: Set connection var ansible_pipelining to False 25675 1727204020.61965: Set connection var ansible_shell_executable to /bin/sh 25675 1727204020.61967: Set connection var ansible_connection to ssh 25675 1727204020.61990: variable 'ansible_shell_executable' from source: unknown 25675 1727204020.61993: variable 'ansible_connection' from source: unknown 25675 1727204020.61996: variable 'ansible_module_compression' from source: unknown 25675 1727204020.61998: variable 'ansible_shell_type' from source: unknown 25675 1727204020.62000: variable 'ansible_shell_executable' from source: unknown 25675 1727204020.62004: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.62007: variable 'ansible_pipelining' from source: unknown 25675 1727204020.62009: variable 'ansible_timeout' from source: unknown 25675 1727204020.62012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.62110: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204020.62119: variable 'omit' from source: magic vars 25675 1727204020.62126: starting attempt loop 25675 1727204020.62129: running the handler 25675 1727204020.62166: variable '__network_connections_result' from source: set_fact 25675 1727204020.62219: variable '__network_connections_result' from source: set_fact 25675 1727204020.62292: handler run complete 25675 1727204020.62308: attempt loop complete, returning result 25675 1727204020.62311: _execute() done 25675 1727204020.62313: dumping result to json 25675 1727204020.62318: done dumping result, returning 25675 1727204020.62325: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-41bd-b19d-00000000006f] 25675 1727204020.62327: sending task result for task 028d2410-947f-41bd-b19d-00000000006f 25675 1727204020.62416: done sending task result for task 028d2410-947f-41bd-b19d-00000000006f 25675 1727204020.62418: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 25675 1727204020.62499: no more pending results, returning what we have 25675 1727204020.62502: results queue empty 25675 1727204020.62503: checking for any_errors_fatal 25675 1727204020.62507: done checking for any_errors_fatal 25675 1727204020.62508: checking for max_fail_percentage 25675 1727204020.62509: done checking for max_fail_percentage 25675 1727204020.62510: checking to see if 
all hosts have failed and the running result is not ok 25675 1727204020.62511: done checking to see if all hosts have failed 25675 1727204020.62512: getting the remaining hosts for this loop 25675 1727204020.62513: done getting the remaining hosts for this loop 25675 1727204020.62516: getting the next task for host managed-node2 25675 1727204020.62520: done getting next task for host managed-node2 25675 1727204020.62523: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25675 1727204020.62525: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204020.62535: getting variables 25675 1727204020.62536: in VariableManager get_vars() 25675 1727204020.62566: Calling all_inventory to load vars for managed-node2 25675 1727204020.62568: Calling groups_inventory to load vars for managed-node2 25675 1727204020.62570: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204020.62581: Calling all_plugins_play to load vars for managed-node2 25675 1727204020.62584: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204020.62587: Calling groups_plugins_play to load vars for managed-node2 25675 1727204020.63437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204020.64321: done with get_vars() 25675 1727204020.64336: done getting variables 25675 1727204020.64379: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:53:40 -0400 (0:00:00.036) 0:00:40.095 ***** 25675 1727204020.64404: entering _queue_task() for managed-node2/debug 25675 1727204020.64613: worker is 1 (out of 1 available) 25675 1727204020.64625: exiting _queue_task() for managed-node2/debug 25675 1727204020.64636: done queuing things up, now waiting for results queue to drain 25675 1727204020.64638: waiting for pending results... 
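In the __network_connections_result echoed a few entries above, the module's raw stderr of "\n" is displayed as stderr_lines == [""]. A small illustration of that convention (my own helper, assumed to mirror the *_lines fields rather than copied from Ansible's source):

    def to_lines(stream_text):
        # Unlike split("\n"), str.splitlines() does not add a trailing empty string
        # for a final newline, so a stream that is just "\n" becomes [""] -- exactly
        # what the debug task printed for stderr_lines.
        return stream_text.splitlines()

    print(to_lines("\n"))         # ['']
    print(to_lines("warning\n"))  # ['warning']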
25675 1727204020.64806: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25675 1727204020.64879: in run() - task 028d2410-947f-41bd-b19d-000000000070 25675 1727204020.64889: variable 'ansible_search_path' from source: unknown 25675 1727204020.64892: variable 'ansible_search_path' from source: unknown 25675 1727204020.64920: calling self._execute() 25675 1727204020.64989: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.64994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.65001: variable 'omit' from source: magic vars 25675 1727204020.65263: variable 'ansible_distribution_major_version' from source: facts 25675 1727204020.65272: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204020.65357: variable 'network_state' from source: role '' defaults 25675 1727204020.65364: Evaluated conditional (network_state != {}): False 25675 1727204020.65367: when evaluation is False, skipping this task 25675 1727204020.65370: _execute() done 25675 1727204020.65373: dumping result to json 25675 1727204020.65381: done dumping result, returning 25675 1727204020.65386: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-41bd-b19d-000000000070] 25675 1727204020.65391: sending task result for task 028d2410-947f-41bd-b19d-000000000070 25675 1727204020.65477: done sending task result for task 028d2410-947f-41bd-b19d-000000000070 25675 1727204020.65482: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 25675 1727204020.65553: no more pending results, returning what we have 25675 1727204020.65555: results queue empty 25675 1727204020.65556: checking for any_errors_fatal 25675 1727204020.65562: done checking for any_errors_fatal 25675 1727204020.65562: checking for max_fail_percentage 25675 1727204020.65564: done checking for max_fail_percentage 25675 1727204020.65565: checking to see if all hosts have failed and the running result is not ok 25675 1727204020.65566: done checking to see if all hosts have failed 25675 1727204020.65567: getting the remaining hosts for this loop 25675 1727204020.65567: done getting the remaining hosts for this loop 25675 1727204020.65570: getting the next task for host managed-node2 25675 1727204020.65574: done getting next task for host managed-node2 25675 1727204020.65582: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25675 1727204020.65584: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204020.65596: getting variables 25675 1727204020.65597: in VariableManager get_vars() 25675 1727204020.65625: Calling all_inventory to load vars for managed-node2 25675 1727204020.65628: Calling groups_inventory to load vars for managed-node2 25675 1727204020.65630: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204020.65637: Calling all_plugins_play to load vars for managed-node2 25675 1727204020.65640: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204020.65643: Calling groups_plugins_play to load vars for managed-node2 25675 1727204020.66374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204020.67348: done with get_vars() 25675 1727204020.67363: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:53:40 -0400 (0:00:00.030) 0:00:40.125 ***** 25675 1727204020.67429: entering _queue_task() for managed-node2/ping 25675 1727204020.67642: worker is 1 (out of 1 available) 25675 1727204020.67655: exiting _queue_task() for managed-node2/ping 25675 1727204020.67668: done queuing things up, now waiting for results queue to drain 25675 1727204020.67669: waiting for pending results... 25675 1727204020.67834: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 25675 1727204020.67904: in run() - task 028d2410-947f-41bd-b19d-000000000071 25675 1727204020.67920: variable 'ansible_search_path' from source: unknown 25675 1727204020.67923: variable 'ansible_search_path' from source: unknown 25675 1727204020.67948: calling self._execute() 25675 1727204020.68017: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.68022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.68032: variable 'omit' from source: magic vars 25675 1727204020.68301: variable 'ansible_distribution_major_version' from source: facts 25675 1727204020.68310: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204020.68316: variable 'omit' from source: magic vars 25675 1727204020.68342: variable 'omit' from source: magic vars 25675 1727204020.68368: variable 'omit' from source: magic vars 25675 1727204020.68403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204020.68428: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204020.68446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204020.68461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204020.68472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204020.68497: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204020.68500: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.68503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.68568: Set connection var ansible_shell_type to sh 25675 1727204020.68573: Set connection var 
ansible_module_compression to ZIP_DEFLATED 25675 1727204020.68586: Set connection var ansible_timeout to 10 25675 1727204020.68590: Set connection var ansible_pipelining to False 25675 1727204020.68592: Set connection var ansible_shell_executable to /bin/sh 25675 1727204020.68594: Set connection var ansible_connection to ssh 25675 1727204020.68611: variable 'ansible_shell_executable' from source: unknown 25675 1727204020.68614: variable 'ansible_connection' from source: unknown 25675 1727204020.68618: variable 'ansible_module_compression' from source: unknown 25675 1727204020.68620: variable 'ansible_shell_type' from source: unknown 25675 1727204020.68623: variable 'ansible_shell_executable' from source: unknown 25675 1727204020.68625: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204020.68627: variable 'ansible_pipelining' from source: unknown 25675 1727204020.68629: variable 'ansible_timeout' from source: unknown 25675 1727204020.68633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204020.68774: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204020.68785: variable 'omit' from source: magic vars 25675 1727204020.68790: starting attempt loop 25675 1727204020.68793: running the handler 25675 1727204020.68807: _low_level_execute_command(): starting 25675 1727204020.68814: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204020.69327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.69331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.69335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727204020.69340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.69393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.69396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.69399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.69485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.71186: stdout chunk (state=3): >>>/root <<< 25675 1727204020.71282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.71314: stderr chunk (state=3): >>><<< 25675 1727204020.71317: stdout chunk (state=3): >>><<< 25675 1727204020.71338: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204020.71349: _low_level_execute_command(): starting 25675 1727204020.71355: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764 `" && echo ansible-tmp-1727204020.7133806-28913-89733957108764="` echo /root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764 `" ) && sleep 0' 25675 1727204020.71797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.71801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.71811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.71813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.71861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.71864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.71869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.71939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.73839: stdout chunk (state=3): >>>ansible-tmp-1727204020.7133806-28913-89733957108764=/root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764 <<< 25675 1727204020.73944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.73968: stderr chunk (state=3): >>><<< 25675 
1727204020.73972: stdout chunk (state=3): >>><<< 25675 1727204020.73992: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204020.7133806-28913-89733957108764=/root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204020.74028: variable 'ansible_module_compression' from source: unknown 25675 1727204020.74059: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 25675 1727204020.74094: variable 'ansible_facts' from source: unknown 25675 1727204020.74143: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/AnsiballZ_ping.py 25675 1727204020.74242: Sending initial data 25675 1727204020.74245: Sent initial data (152 bytes) 25675 1727204020.74683: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204020.74687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.74690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204020.74692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.74736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.74739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.74815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.76387: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 25675 1727204020.76395: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204020.76454: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204020.76526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp53jm_77f /root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/AnsiballZ_ping.py <<< 25675 1727204020.76529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/AnsiballZ_ping.py" <<< 25675 1727204020.76598: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp53jm_77f" to remote "/root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/AnsiballZ_ping.py" <<< 25675 1727204020.76601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/AnsiballZ_ping.py" <<< 25675 1727204020.77244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.77263: stderr chunk (state=3): >>><<< 25675 1727204020.77267: stdout chunk (state=3): >>><<< 25675 1727204020.77288: done transferring module to remote 25675 1727204020.77297: _low_level_execute_command(): starting 25675 1727204020.77301: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/ /root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/AnsiballZ_ping.py && sleep 0' 25675 1727204020.77744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.77747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204020.77750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.77752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.77758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 
1727204020.77811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.77818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.77820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.77892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.79660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.79688: stderr chunk (state=3): >>><<< 25675 1727204020.79691: stdout chunk (state=3): >>><<< 25675 1727204020.79706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204020.79709: _low_level_execute_command(): starting 25675 1727204020.79714: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/AnsiballZ_ping.py && sleep 0' 25675 1727204020.80159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.80162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204020.80164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.80166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.80168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.80213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.80216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.80299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.95095: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25675 1727204020.96381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204020.96408: stderr chunk (state=3): >>><<< 25675 1727204020.96411: stdout chunk (state=3): >>><<< 25675 1727204020.96426: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
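The entries around this point show the whole low-level flow for the "Re-test connectivity" ping: a remote temp directory is created under ~/.ansible/tmp, the AnsiballZ_ping.py payload is uploaded over the multiplexed SSH connection via SFTP, marked executable, run with /usr/bin/python3.12, and the JSON it prints on stdout becomes the task result; the temp directory is removed a few entries below. A condensed sketch of that sequence (command text paraphrased, suffixes elided; the parse_module_result helper is hypothetical):

    import json

    # Paraphrase of the remote commands visible in the surrounding
    # _low_level_execute_command() entries; the real temp-dir names carry a
    # timestamp/pid/random suffix that is elided here.
    REMOTE_STEPS = [
        'umask 77 && mkdir -p "$HOME/.ansible/tmp" && mkdir "$HOME/.ansible/tmp/ansible-tmp-..."',
        "sftp: put AnsiballZ_ping.py into the temp directory",
        "chmod u+x on the temp directory and AnsiballZ_ping.py",
        "/usr/bin/python3.12 .../AnsiballZ_ping.py   (module prints one JSON document on stdout)",
        "rm -f -r .../ansible-tmp-.../ > /dev/null 2>&1",
    ]

    def parse_module_result(stdout):
        # The controller loads the module's stdout as JSON to build the task result.
        return json.loads(stdout)

    print(parse_module_result('{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}'))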
25675 1727204020.96446: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204020.96454: _low_level_execute_command(): starting 25675 1727204020.96459: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204020.7133806-28913-89733957108764/ > /dev/null 2>&1 && sleep 0' 25675 1727204020.96922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204020.96925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204020.96929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727204020.96932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204020.96934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204020.96983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204020.96987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204020.96999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204020.97065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204020.98916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204020.98941: stderr chunk (state=3): >>><<< 25675 1727204020.98945: stdout chunk (state=3): >>><<< 25675 1727204020.98957: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204020.98966: handler run complete 25675 1727204020.98983: attempt loop complete, returning result 25675 1727204020.98986: _execute() done 25675 1727204020.98988: dumping result to json 25675 1727204020.98990: done dumping result, returning 25675 1727204020.98997: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-41bd-b19d-000000000071] 25675 1727204020.99002: sending task result for task 028d2410-947f-41bd-b19d-000000000071 25675 1727204020.99094: done sending task result for task 028d2410-947f-41bd-b19d-000000000071 25675 1727204020.99097: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 25675 1727204020.99149: no more pending results, returning what we have 25675 1727204020.99151: results queue empty 25675 1727204020.99152: checking for any_errors_fatal 25675 1727204020.99160: done checking for any_errors_fatal 25675 1727204020.99160: checking for max_fail_percentage 25675 1727204020.99162: done checking for max_fail_percentage 25675 1727204020.99163: checking to see if all hosts have failed and the running result is not ok 25675 1727204020.99164: done checking to see if all hosts have failed 25675 1727204020.99164: getting the remaining hosts for this loop 25675 1727204020.99166: done getting the remaining hosts for this loop 25675 1727204020.99169: getting the next task for host managed-node2 25675 1727204020.99181: done getting next task for host managed-node2 25675 1727204020.99183: ^ task is: TASK: meta (role_complete) 25675 1727204020.99185: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204020.99194: getting variables 25675 1727204020.99196: in VariableManager get_vars() 25675 1727204020.99233: Calling all_inventory to load vars for managed-node2 25675 1727204020.99235: Calling groups_inventory to load vars for managed-node2 25675 1727204020.99237: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204020.99246: Calling all_plugins_play to load vars for managed-node2 25675 1727204020.99249: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204020.99251: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.00085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.00970: done with get_vars() 25675 1727204021.00991: done getting variables 25675 1727204021.01047: done queuing things up, now waiting for results queue to drain 25675 1727204021.01049: results queue empty 25675 1727204021.01049: checking for any_errors_fatal 25675 1727204021.01051: done checking for any_errors_fatal 25675 1727204021.01051: checking for max_fail_percentage 25675 1727204021.01052: done checking for max_fail_percentage 25675 1727204021.01053: checking to see if all hosts have failed and the running result is not ok 25675 1727204021.01053: done checking to see if all hosts have failed 25675 1727204021.01054: getting the remaining hosts for this loop 25675 1727204021.01054: done getting the remaining hosts for this loop 25675 1727204021.01056: getting the next task for host managed-node2 25675 1727204021.01058: done getting next task for host managed-node2 25675 1727204021.01059: ^ task is: TASK: meta (flush_handlers) 25675 1727204021.01060: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204021.01062: getting variables 25675 1727204021.01063: in VariableManager get_vars() 25675 1727204021.01072: Calling all_inventory to load vars for managed-node2 25675 1727204021.01074: Calling groups_inventory to load vars for managed-node2 25675 1727204021.01076: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.01082: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.01083: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.01085: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.01822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.02718: done with get_vars() 25675 1727204021.02732: done getting variables 25675 1727204021.02767: in VariableManager get_vars() 25675 1727204021.02780: Calling all_inventory to load vars for managed-node2 25675 1727204021.02782: Calling groups_inventory to load vars for managed-node2 25675 1727204021.02783: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.02786: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.02788: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.02789: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.03433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.04306: done with get_vars() 25675 1727204021.04323: done queuing things up, now waiting for results queue to drain 25675 1727204021.04325: results queue empty 25675 1727204021.04325: checking for any_errors_fatal 25675 1727204021.04326: done checking for any_errors_fatal 25675 1727204021.04327: checking for max_fail_percentage 25675 1727204021.04327: done checking for max_fail_percentage 25675 1727204021.04328: checking to see if all hosts have failed and the running result is not ok 25675 1727204021.04328: done checking to see if all hosts have failed 25675 1727204021.04329: getting the remaining hosts for this loop 25675 1727204021.04329: done getting the remaining hosts for this loop 25675 1727204021.04331: getting the next task for host managed-node2 25675 1727204021.04334: done getting next task for host managed-node2 25675 1727204021.04335: ^ task is: TASK: meta (flush_handlers) 25675 1727204021.04336: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204021.04342: getting variables 25675 1727204021.04343: in VariableManager get_vars() 25675 1727204021.04350: Calling all_inventory to load vars for managed-node2 25675 1727204021.04351: Calling groups_inventory to load vars for managed-node2 25675 1727204021.04352: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.04356: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.04357: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.04359: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.05030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.05882: done with get_vars() 25675 1727204021.05899: done getting variables 25675 1727204021.05931: in VariableManager get_vars() 25675 1727204021.05939: Calling all_inventory to load vars for managed-node2 25675 1727204021.05940: Calling groups_inventory to load vars for managed-node2 25675 1727204021.05942: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.05945: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.05946: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.05948: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.06579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.07518: done with get_vars() 25675 1727204021.07534: done queuing things up, now waiting for results queue to drain 25675 1727204021.07536: results queue empty 25675 1727204021.07536: checking for any_errors_fatal 25675 1727204021.07537: done checking for any_errors_fatal 25675 1727204021.07538: checking for max_fail_percentage 25675 1727204021.07538: done checking for max_fail_percentage 25675 1727204021.07539: checking to see if all hosts have failed and the running result is not ok 25675 1727204021.07539: done checking to see if all hosts have failed 25675 1727204021.07540: getting the remaining hosts for this loop 25675 1727204021.07540: done getting the remaining hosts for this loop 25675 1727204021.07542: getting the next task for host managed-node2 25675 1727204021.07544: done getting next task for host managed-node2 25675 1727204021.07545: ^ task is: None 25675 1727204021.07546: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204021.07546: done queuing things up, now waiting for results queue to drain 25675 1727204021.07547: results queue empty 25675 1727204021.07547: checking for any_errors_fatal 25675 1727204021.07548: done checking for any_errors_fatal 25675 1727204021.07548: checking for max_fail_percentage 25675 1727204021.07549: done checking for max_fail_percentage 25675 1727204021.07549: checking to see if all hosts have failed and the running result is not ok 25675 1727204021.07550: done checking to see if all hosts have failed 25675 1727204021.07550: getting the next task for host managed-node2 25675 1727204021.07552: done getting next task for host managed-node2 25675 1727204021.07552: ^ task is: None 25675 1727204021.07553: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204021.07593: in VariableManager get_vars() 25675 1727204021.07604: done with get_vars() 25675 1727204021.07608: in VariableManager get_vars() 25675 1727204021.07613: done with get_vars() 25675 1727204021.07616: variable 'omit' from source: magic vars 25675 1727204021.07637: in VariableManager get_vars() 25675 1727204021.07644: done with get_vars() 25675 1727204021.07658: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 25675 1727204021.07819: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727204021.07843: getting the remaining hosts for this loop 25675 1727204021.07844: done getting the remaining hosts for this loop 25675 1727204021.07846: getting the next task for host managed-node2 25675 1727204021.07848: done getting next task for host managed-node2 25675 1727204021.07849: ^ task is: TASK: Gathering Facts 25675 1727204021.07850: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204021.07851: getting variables 25675 1727204021.07852: in VariableManager get_vars() 25675 1727204021.07857: Calling all_inventory to load vars for managed-node2 25675 1727204021.07858: Calling groups_inventory to load vars for managed-node2 25675 1727204021.07860: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.07863: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.07864: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.07866: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.08516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.09365: done with get_vars() 25675 1727204021.09380: done getting variables 25675 1727204021.09408: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Tuesday 24 September 2024 14:53:41 -0400 (0:00:00.419) 0:00:40.545 ***** 25675 1727204021.09425: entering _queue_task() for managed-node2/gather_facts 25675 1727204021.09671: worker is 1 (out of 1 available) 25675 1727204021.09683: exiting _queue_task() for managed-node2/gather_facts 25675 1727204021.09694: done queuing things up, now waiting for results queue to drain 25675 1727204021.09695: waiting for pending results... 
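The TASK [Gathering Facts] queued above is the implicit fact-gathering step for the new play "Assert device and profile are absent" defined around tests_ethernet.yml:68, which then reaches the include task at :71. A minimal sketch of a play that would produce this sequence is shown below; the hosts pattern and the include path are assumptions, since the playbook source itself is not part of this log:

- name: Assert device and profile are absent
  hosts: managed-node2            # assumption: only managed-node2 is targeted in this part of the log
  gather_facts: true              # explicit for clarity; produces the implicit TASK [Gathering Facts] seen here
  tasks:
    - name: Include the task 'assert_profile_absent.yml'
      ansible.builtin.include_tasks: assert_profile_absent.yml   # file name from the log; the directory is not shown and is assumed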
25675 1727204021.09870: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727204021.09933: in run() - task 028d2410-947f-41bd-b19d-0000000004e4 25675 1727204021.09946: variable 'ansible_search_path' from source: unknown 25675 1727204021.09975: calling self._execute() 25675 1727204021.10045: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204021.10051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204021.10059: variable 'omit' from source: magic vars 25675 1727204021.10339: variable 'ansible_distribution_major_version' from source: facts 25675 1727204021.10349: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204021.10355: variable 'omit' from source: magic vars 25675 1727204021.10379: variable 'omit' from source: magic vars 25675 1727204021.10406: variable 'omit' from source: magic vars 25675 1727204021.10437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204021.10466: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204021.10486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204021.10500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204021.10510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204021.10532: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204021.10536: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204021.10538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204021.10611: Set connection var ansible_shell_type to sh 25675 1727204021.10614: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204021.10620: Set connection var ansible_timeout to 10 25675 1727204021.10625: Set connection var ansible_pipelining to False 25675 1727204021.10630: Set connection var ansible_shell_executable to /bin/sh 25675 1727204021.10632: Set connection var ansible_connection to ssh 25675 1727204021.10652: variable 'ansible_shell_executable' from source: unknown 25675 1727204021.10655: variable 'ansible_connection' from source: unknown 25675 1727204021.10658: variable 'ansible_module_compression' from source: unknown 25675 1727204021.10661: variable 'ansible_shell_type' from source: unknown 25675 1727204021.10664: variable 'ansible_shell_executable' from source: unknown 25675 1727204021.10666: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204021.10668: variable 'ansible_pipelining' from source: unknown 25675 1727204021.10671: variable 'ansible_timeout' from source: unknown 25675 1727204021.10674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204021.10812: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204021.10822: variable 'omit' from source: magic vars 25675 1727204021.10826: starting attempt loop 25675 1727204021.10828: running the 
handler 25675 1727204021.10842: variable 'ansible_facts' from source: unknown 25675 1727204021.10856: _low_level_execute_command(): starting 25675 1727204021.10863: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204021.11354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204021.11392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.11396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204021.11400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.11447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204021.11450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204021.11452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204021.11537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204021.13250: stdout chunk (state=3): >>>/root <<< 25675 1727204021.13348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204021.13382: stderr chunk (state=3): >>><<< 25675 1727204021.13386: stdout chunk (state=3): >>><<< 25675 1727204021.13411: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204021.13425: _low_level_execute_command(): starting 25675 1727204021.13430: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681 `" && echo ansible-tmp-1727204021.1341212-28922-22144465654681="` echo /root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681 `" ) && sleep 0' 25675 1727204021.13884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204021.13887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204021.13889: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.13900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204021.13902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.13952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204021.13960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204021.13963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204021.14028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204021.15964: stdout chunk (state=3): >>>ansible-tmp-1727204021.1341212-28922-22144465654681=/root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681 <<< 25675 1727204021.16069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204021.16101: stderr chunk (state=3): >>><<< 25675 1727204021.16104: stdout chunk (state=3): >>><<< 25675 1727204021.16123: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204021.1341212-28922-22144465654681=/root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 25675 1727204021.16146: variable 'ansible_module_compression' from source: unknown 25675 1727204021.16188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727204021.16243: variable 'ansible_facts' from source: unknown 25675 1727204021.16380: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/AnsiballZ_setup.py 25675 1727204021.16482: Sending initial data 25675 1727204021.16485: Sent initial data (153 bytes) 25675 1727204021.16928: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204021.16931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204021.16933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.16935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204021.16938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.16983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204021.17003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204021.17068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204021.18647: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204021.18720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204021.18790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp3zct06bp /root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/AnsiballZ_setup.py <<< 25675 1727204021.18793: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/AnsiballZ_setup.py" <<< 25675 1727204021.18863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp3zct06bp" to remote "/root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/AnsiballZ_setup.py" <<< 25675 1727204021.18867: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/AnsiballZ_setup.py" <<< 25675 1727204021.20050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204021.20092: stderr chunk (state=3): >>><<< 25675 1727204021.20095: stdout chunk (state=3): >>><<< 25675 1727204021.20111: done transferring module to remote 25675 1727204021.20120: _low_level_execute_command(): starting 25675 1727204021.20125: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/ /root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/AnsiballZ_setup.py && sleep 0' 25675 1727204021.20570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204021.20573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204021.20581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.20583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204021.20589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.20631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204021.20634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204021.20713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204021.22493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204021.22516: stderr chunk (state=3): >>><<< 25675 1727204021.22519: stdout chunk (state=3): >>><<< 25675 1727204021.22536: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204021.22539: _low_level_execute_command(): starting 25675 1727204021.22542: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/AnsiballZ_setup.py && sleep 0' 25675 1727204021.22970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204021.22973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204021.22976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727204021.22981: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204021.22983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.23028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204021.23031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204021.23113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204021.86674: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], 
"ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY"<<< 25675 1727204021.86701: stdout chunk (state=3): >>>: "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.4921875, "5m": 0.43359375, "15m": 0.2333984375}, "ansible_selinux_python_present": 
true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2923, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 608, "free": 2923}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 607, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785747456, "block_size": 4096, "block_total": 65519099, "block_available": 63912536, "block_used": 1606563, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "41", "epoch": "1727204021", "epoch_int": "1727204021", "date": "2024-09-24", "time": "14:53:41", "iso8601_micro": "2024-09-24T18:53:41.827718Z", "iso8601": 
"2024-09-24T18:53:41Z", "iso8601_basic": "20240924T145341827718", "iso8601_basic_short": "20240924T145341", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.<<< 25675 1727204021.86722: stdout chunk (state=3): >>>0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727204021.88682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204021.88713: stderr chunk (state=3): >>><<< 25675 1727204021.88716: stdout chunk (state=3): >>><<< 25675 1727204021.88752: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": 
"/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.4921875, "5m": 0.43359375, "15m": 0.2333984375}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2923, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 608, "free": 2923}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 607, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785747456, "block_size": 4096, 
"block_total": 65519099, "block_available": 63912536, "block_used": 1606563, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "41", "epoch": "1727204021", "epoch_int": "1727204021", "date": "2024-09-24", "time": "14:53:41", "iso8601_micro": "2024-09-24T18:53:41.827718Z", "iso8601": "2024-09-24T18:53:41Z", "iso8601_basic": "20240924T145341827718", "iso8601_basic_short": "20240924T145341", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], 
"fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204021.88977: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204021.89001: _low_level_execute_command(): starting 25675 1727204021.89004: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204021.1341212-28922-22144465654681/ > /dev/null 2>&1 && sleep 0' 25675 1727204021.89466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204021.89469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204021.89472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.89475: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204021.89478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204021.89533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' <<< 25675 1727204021.89536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204021.89540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204021.89612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204021.91452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204021.91456: stdout chunk (state=3): >>><<< 25675 1727204021.91682: stderr chunk (state=3): >>><<< 25675 1727204021.91686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204021.91692: handler run complete 25675 1727204021.91695: variable 'ansible_facts' from source: unknown 25675 1727204021.91708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.91995: variable 'ansible_facts' from source: unknown 25675 1727204021.92083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.92206: attempt loop complete, returning result 25675 1727204021.92214: _execute() done 25675 1727204021.92219: dumping result to json 25675 1727204021.92250: done dumping result, returning 25675 1727204021.92264: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-0000000004e4] 25675 1727204021.92271: sending task result for task 028d2410-947f-41bd-b19d-0000000004e4 ok: [managed-node2] 25675 1727204021.92966: no more pending results, returning what we have 25675 1727204021.92969: results queue empty 25675 1727204021.92970: checking for any_errors_fatal 25675 1727204021.92971: done checking for any_errors_fatal 25675 1727204021.92971: checking for max_fail_percentage 25675 1727204021.92973: done checking for max_fail_percentage 25675 1727204021.92974: checking to see if all hosts have failed and the running result is not ok 25675 1727204021.92974: done checking to see if all hosts have failed 25675 1727204021.92977: getting the remaining hosts for this loop 25675 1727204021.92980: done getting the remaining hosts for this loop 25675 1727204021.92983: getting the next task for host managed-node2 25675 1727204021.92988: done getting next task for host managed-node2 25675 1727204021.92989: ^ task is: TASK: meta (flush_handlers) 
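The fact-gathering step above finishes with ok: [managed-node2] after running ansible.legacy.setup with the module_args echoed in the result. A minimal explicit equivalent of that invocation, written as a standalone task, could look like the following sketch (parameter values copied from the invocation block in this log; the task name is illustrative only):

- name: Gather facts explicitly (illustrative equivalent of the implicit step)
  ansible.builtin.setup:
    gather_subset:
      - all
    gather_timeout: 10
    filter: []
    fact_path: /etc/ansible/facts.d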
25675 1727204021.92991: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204021.92995: getting variables 25675 1727204021.92996: in VariableManager get_vars() 25675 1727204021.93016: Calling all_inventory to load vars for managed-node2 25675 1727204021.93018: Calling groups_inventory to load vars for managed-node2 25675 1727204021.93020: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.93031: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.93034: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.93037: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.93858: done sending task result for task 028d2410-947f-41bd-b19d-0000000004e4 25675 1727204021.93862: WORKER PROCESS EXITING 25675 1727204021.93884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.94745: done with get_vars() 25675 1727204021.94760: done getting variables 25675 1727204021.94812: in VariableManager get_vars() 25675 1727204021.94818: Calling all_inventory to load vars for managed-node2 25675 1727204021.94820: Calling groups_inventory to load vars for managed-node2 25675 1727204021.94822: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.94826: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.94828: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.94829: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.95453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.96391: done with get_vars() 25675 1727204021.96408: done queuing things up, now waiting for results queue to drain 25675 1727204021.96409: results queue empty 25675 1727204021.96410: checking for any_errors_fatal 25675 1727204021.96412: done checking for any_errors_fatal 25675 1727204021.96416: checking for max_fail_percentage 25675 1727204021.96417: done checking for max_fail_percentage 25675 1727204021.96417: checking to see if all hosts have failed and the running result is not ok 25675 1727204021.96418: done checking to see if all hosts have failed 25675 1727204021.96418: getting the remaining hosts for this loop 25675 1727204021.96419: done getting the remaining hosts for this loop 25675 1727204021.96421: getting the next task for host managed-node2 25675 1727204021.96423: done getting next task for host managed-node2 25675 1727204021.96425: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 25675 1727204021.96426: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204021.96427: getting variables 25675 1727204021.96428: in VariableManager get_vars() 25675 1727204021.96433: Calling all_inventory to load vars for managed-node2 25675 1727204021.96434: Calling groups_inventory to load vars for managed-node2 25675 1727204021.96435: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.96439: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.96440: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.96442: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.97071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204021.97918: done with get_vars() 25675 1727204021.97931: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Tuesday 24 September 2024 14:53:41 -0400 (0:00:00.885) 0:00:41.431 ***** 25675 1727204021.97984: entering _queue_task() for managed-node2/include_tasks 25675 1727204021.98229: worker is 1 (out of 1 available) 25675 1727204021.98241: exiting _queue_task() for managed-node2/include_tasks 25675 1727204021.98253: done queuing things up, now waiting for results queue to drain 25675 1727204021.98255: waiting for pending results... 25675 1727204021.98425: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_absent.yml' 25675 1727204021.98496: in run() - task 028d2410-947f-41bd-b19d-000000000074 25675 1727204021.98508: variable 'ansible_search_path' from source: unknown 25675 1727204021.98536: calling self._execute() 25675 1727204021.98604: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204021.98610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204021.98617: variable 'omit' from source: magic vars 25675 1727204021.98884: variable 'ansible_distribution_major_version' from source: facts 25675 1727204021.98894: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204021.98899: _execute() done 25675 1727204021.98902: dumping result to json 25675 1727204021.98907: done dumping result, returning 25675 1727204021.98914: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_absent.yml' [028d2410-947f-41bd-b19d-000000000074] 25675 1727204021.98917: sending task result for task 028d2410-947f-41bd-b19d-000000000074 25675 1727204021.99008: done sending task result for task 028d2410-947f-41bd-b19d-000000000074 25675 1727204021.99011: WORKER PROCESS EXITING 25675 1727204021.99053: no more pending results, returning what we have 25675 1727204021.99057: in VariableManager get_vars() 25675 1727204021.99094: Calling all_inventory to load vars for managed-node2 25675 1727204021.99096: Calling groups_inventory to load vars for managed-node2 25675 1727204021.99099: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204021.99111: Calling all_plugins_play to load vars for managed-node2 25675 1727204021.99114: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204021.99117: Calling groups_plugins_play to load vars for managed-node2 25675 1727204021.99961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204022.00828: done with get_vars() 25675 
1727204022.00843: variable 'ansible_search_path' from source: unknown 25675 1727204022.00854: we have included files to process 25675 1727204022.00855: generating all_blocks data 25675 1727204022.00855: done generating all_blocks data 25675 1727204022.00856: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 25675 1727204022.00857: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 25675 1727204022.00858: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 25675 1727204022.00966: in VariableManager get_vars() 25675 1727204022.00981: done with get_vars() 25675 1727204022.01052: done processing included file 25675 1727204022.01053: iterating over new_blocks loaded from include file 25675 1727204022.01055: in VariableManager get_vars() 25675 1727204022.01063: done with get_vars() 25675 1727204022.01064: filtering new block on tags 25675 1727204022.01079: done filtering new block on tags 25675 1727204022.01081: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 25675 1727204022.01084: extending task lists for all hosts with included blocks 25675 1727204022.01112: done extending task lists 25675 1727204022.01113: done processing included files 25675 1727204022.01113: results queue empty 25675 1727204022.01114: checking for any_errors_fatal 25675 1727204022.01114: done checking for any_errors_fatal 25675 1727204022.01115: checking for max_fail_percentage 25675 1727204022.01115: done checking for max_fail_percentage 25675 1727204022.01116: checking to see if all hosts have failed and the running result is not ok 25675 1727204022.01117: done checking to see if all hosts have failed 25675 1727204022.01117: getting the remaining hosts for this loop 25675 1727204022.01118: done getting the remaining hosts for this loop 25675 1727204022.01119: getting the next task for host managed-node2 25675 1727204022.01121: done getting next task for host managed-node2 25675 1727204022.01123: ^ task is: TASK: Include the task 'get_profile_stat.yml' 25675 1727204022.01124: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204022.01126: getting variables 25675 1727204022.01126: in VariableManager get_vars() 25675 1727204022.01132: Calling all_inventory to load vars for managed-node2 25675 1727204022.01133: Calling groups_inventory to load vars for managed-node2 25675 1727204022.01134: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204022.01138: Calling all_plugins_play to load vars for managed-node2 25675 1727204022.01139: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204022.01141: Calling groups_plugins_play to load vars for managed-node2 25675 1727204022.01868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204022.02731: done with get_vars() 25675 1727204022.02745: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:53:42 -0400 (0:00:00.048) 0:00:41.479 ***** 25675 1727204022.02796: entering _queue_task() for managed-node2/include_tasks 25675 1727204022.03029: worker is 1 (out of 1 available) 25675 1727204022.03041: exiting _queue_task() for managed-node2/include_tasks 25675 1727204022.03053: done queuing things up, now waiting for results queue to drain 25675 1727204022.03054: waiting for pending results... 25675 1727204022.03394: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 25675 1727204022.03400: in run() - task 028d2410-947f-41bd-b19d-0000000004f5 25675 1727204022.03403: variable 'ansible_search_path' from source: unknown 25675 1727204022.03405: variable 'ansible_search_path' from source: unknown 25675 1727204022.03408: calling self._execute() 25675 1727204022.03497: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.03508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.03528: variable 'omit' from source: magic vars 25675 1727204022.03915: variable 'ansible_distribution_major_version' from source: facts 25675 1727204022.03931: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204022.03941: _execute() done 25675 1727204022.03949: dumping result to json 25675 1727204022.03964: done dumping result, returning 25675 1727204022.03973: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-41bd-b19d-0000000004f5] 25675 1727204022.03988: sending task result for task 028d2410-947f-41bd-b19d-0000000004f5 25675 1727204022.04218: no more pending results, returning what we have 25675 1727204022.04223: in VariableManager get_vars() 25675 1727204022.04259: Calling all_inventory to load vars for managed-node2 25675 1727204022.04262: Calling groups_inventory to load vars for managed-node2 25675 1727204022.04266: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204022.04286: Calling all_plugins_play to load vars for managed-node2 25675 1727204022.04290: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204022.04293: Calling groups_plugins_play to load vars for managed-node2 25675 1727204022.04890: done sending task result for task 028d2410-947f-41bd-b19d-0000000004f5 25675 1727204022.04894: WORKER PROCESS EXITING 25675 1727204022.05487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 25675 1727204022.06362: done with get_vars() 25675 1727204022.06379: variable 'ansible_search_path' from source: unknown 25675 1727204022.06380: variable 'ansible_search_path' from source: unknown 25675 1727204022.06403: we have included files to process 25675 1727204022.06404: generating all_blocks data 25675 1727204022.06405: done generating all_blocks data 25675 1727204022.06406: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25675 1727204022.06406: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25675 1727204022.06408: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25675 1727204022.07387: done processing included file 25675 1727204022.07389: iterating over new_blocks loaded from include file 25675 1727204022.07391: in VariableManager get_vars() 25675 1727204022.07403: done with get_vars() 25675 1727204022.07404: filtering new block on tags 25675 1727204022.07426: done filtering new block on tags 25675 1727204022.07429: in VariableManager get_vars() 25675 1727204022.07439: done with get_vars() 25675 1727204022.07440: filtering new block on tags 25675 1727204022.07459: done filtering new block on tags 25675 1727204022.07461: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 25675 1727204022.07466: extending task lists for all hosts with included blocks 25675 1727204022.07562: done extending task lists 25675 1727204022.07564: done processing included files 25675 1727204022.07565: results queue empty 25675 1727204022.07565: checking for any_errors_fatal 25675 1727204022.07569: done checking for any_errors_fatal 25675 1727204022.07570: checking for max_fail_percentage 25675 1727204022.07571: done checking for max_fail_percentage 25675 1727204022.07572: checking to see if all hosts have failed and the running result is not ok 25675 1727204022.07572: done checking to see if all hosts have failed 25675 1727204022.07573: getting the remaining hosts for this loop 25675 1727204022.07574: done getting the remaining hosts for this loop 25675 1727204022.07581: getting the next task for host managed-node2 25675 1727204022.07585: done getting next task for host managed-node2 25675 1727204022.07587: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 25675 1727204022.07590: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204022.07592: getting variables 25675 1727204022.07593: in VariableManager get_vars() 25675 1727204022.07651: Calling all_inventory to load vars for managed-node2 25675 1727204022.07655: Calling groups_inventory to load vars for managed-node2 25675 1727204022.07657: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204022.07663: Calling all_plugins_play to load vars for managed-node2 25675 1727204022.07666: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204022.07668: Calling groups_plugins_play to load vars for managed-node2 25675 1727204022.08868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204022.11002: done with get_vars() 25675 1727204022.11025: done getting variables 25675 1727204022.11067: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:53:42 -0400 (0:00:00.083) 0:00:41.562 ***** 25675 1727204022.11103: entering _queue_task() for managed-node2/set_fact 25675 1727204022.11492: worker is 1 (out of 1 available) 25675 1727204022.11506: exiting _queue_task() for managed-node2/set_fact 25675 1727204022.11520: done queuing things up, now waiting for results queue to drain 25675 1727204022.11521: waiting for pending results... 
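The task queued above is the first step of get_profile_stat.yml (line 3). Based on the ansible_facts reported in its result further down in this log (lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint, all set to false), a minimal sketch of such an initialization task could look like the following; it is reconstructed from the log output, not copied from the actual file:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    # flag names and values taken from the "ok" result reported for this task below
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false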
25675 1727204022.12293: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 25675 1727204022.12298: in run() - task 028d2410-947f-41bd-b19d-000000000502 25675 1727204022.12303: variable 'ansible_search_path' from source: unknown 25675 1727204022.12307: variable 'ansible_search_path' from source: unknown 25675 1727204022.12313: calling self._execute() 25675 1727204022.12571: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.12660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.12683: variable 'omit' from source: magic vars 25675 1727204022.13363: variable 'ansible_distribution_major_version' from source: facts 25675 1727204022.13386: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204022.13399: variable 'omit' from source: magic vars 25675 1727204022.13450: variable 'omit' from source: magic vars 25675 1727204022.13494: variable 'omit' from source: magic vars 25675 1727204022.13539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204022.13586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204022.13611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204022.13634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204022.13648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204022.13880: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204022.13883: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.13886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.13888: Set connection var ansible_shell_type to sh 25675 1727204022.13890: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204022.13892: Set connection var ansible_timeout to 10 25675 1727204022.13894: Set connection var ansible_pipelining to False 25675 1727204022.13895: Set connection var ansible_shell_executable to /bin/sh 25675 1727204022.13897: Set connection var ansible_connection to ssh 25675 1727204022.13899: variable 'ansible_shell_executable' from source: unknown 25675 1727204022.13901: variable 'ansible_connection' from source: unknown 25675 1727204022.13903: variable 'ansible_module_compression' from source: unknown 25675 1727204022.13904: variable 'ansible_shell_type' from source: unknown 25675 1727204022.13911: variable 'ansible_shell_executable' from source: unknown 25675 1727204022.13913: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.13915: variable 'ansible_pipelining' from source: unknown 25675 1727204022.13917: variable 'ansible_timeout' from source: unknown 25675 1727204022.13919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.14022: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204022.14038: variable 
'omit' from source: magic vars 25675 1727204022.14049: starting attempt loop 25675 1727204022.14055: running the handler 25675 1727204022.14070: handler run complete 25675 1727204022.14086: attempt loop complete, returning result 25675 1727204022.14092: _execute() done 25675 1727204022.14098: dumping result to json 25675 1727204022.14105: done dumping result, returning 25675 1727204022.14114: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-41bd-b19d-000000000502] 25675 1727204022.14122: sending task result for task 028d2410-947f-41bd-b19d-000000000502 25675 1727204022.14216: done sending task result for task 028d2410-947f-41bd-b19d-000000000502 25675 1727204022.14223: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 25675 1727204022.14288: no more pending results, returning what we have 25675 1727204022.14292: results queue empty 25675 1727204022.14293: checking for any_errors_fatal 25675 1727204022.14294: done checking for any_errors_fatal 25675 1727204022.14295: checking for max_fail_percentage 25675 1727204022.14296: done checking for max_fail_percentage 25675 1727204022.14297: checking to see if all hosts have failed and the running result is not ok 25675 1727204022.14298: done checking to see if all hosts have failed 25675 1727204022.14298: getting the remaining hosts for this loop 25675 1727204022.14300: done getting the remaining hosts for this loop 25675 1727204022.14304: getting the next task for host managed-node2 25675 1727204022.14311: done getting next task for host managed-node2 25675 1727204022.14314: ^ task is: TASK: Stat profile file 25675 1727204022.14318: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204022.14321: getting variables 25675 1727204022.14323: in VariableManager get_vars() 25675 1727204022.14350: Calling all_inventory to load vars for managed-node2 25675 1727204022.14352: Calling groups_inventory to load vars for managed-node2 25675 1727204022.14355: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204022.14365: Calling all_plugins_play to load vars for managed-node2 25675 1727204022.14368: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204022.14370: Calling groups_plugins_play to load vars for managed-node2 25675 1727204022.15894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204022.18029: done with get_vars() 25675 1727204022.18058: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:53:42 -0400 (0:00:00.070) 0:00:41.633 ***** 25675 1727204022.18192: entering _queue_task() for managed-node2/stat 25675 1727204022.19211: worker is 1 (out of 1 available) 25675 1727204022.19221: exiting _queue_task() for managed-node2/stat 25675 1727204022.19232: done queuing things up, now waiting for results queue to drain 25675 1727204022.19233: waiting for pending results... 25675 1727204022.19359: running TaskExecutor() for managed-node2/TASK: Stat profile file 25675 1727204022.19684: in run() - task 028d2410-947f-41bd-b19d-000000000503 25675 1727204022.19688: variable 'ansible_search_path' from source: unknown 25675 1727204022.19690: variable 'ansible_search_path' from source: unknown 25675 1727204022.19693: calling self._execute() 25675 1727204022.19695: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.19698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.19700: variable 'omit' from source: magic vars 25675 1727204022.20087: variable 'ansible_distribution_major_version' from source: facts 25675 1727204022.20116: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204022.20126: variable 'omit' from source: magic vars 25675 1727204022.20158: variable 'omit' from source: magic vars 25675 1727204022.20241: variable 'profile' from source: include params 25675 1727204022.20246: variable 'interface' from source: set_fact 25675 1727204022.20297: variable 'interface' from source: set_fact 25675 1727204022.20311: variable 'omit' from source: magic vars 25675 1727204022.20345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204022.20373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204022.20394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204022.20408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204022.20418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204022.20444: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204022.20447: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.20449: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.20519: Set connection var ansible_shell_type to sh 25675 1727204022.20523: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204022.20529: Set connection var ansible_timeout to 10 25675 1727204022.20534: Set connection var ansible_pipelining to False 25675 1727204022.20539: Set connection var ansible_shell_executable to /bin/sh 25675 1727204022.20542: Set connection var ansible_connection to ssh 25675 1727204022.20563: variable 'ansible_shell_executable' from source: unknown 25675 1727204022.20566: variable 'ansible_connection' from source: unknown 25675 1727204022.20569: variable 'ansible_module_compression' from source: unknown 25675 1727204022.20573: variable 'ansible_shell_type' from source: unknown 25675 1727204022.20577: variable 'ansible_shell_executable' from source: unknown 25675 1727204022.20582: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.20584: variable 'ansible_pipelining' from source: unknown 25675 1727204022.20587: variable 'ansible_timeout' from source: unknown 25675 1727204022.20589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.20735: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204022.20743: variable 'omit' from source: magic vars 25675 1727204022.20750: starting attempt loop 25675 1727204022.20753: running the handler 25675 1727204022.20764: _low_level_execute_command(): starting 25675 1727204022.20772: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204022.21259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.21271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.21295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204022.21299: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.21341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204022.21352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.21444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.23153: stdout chunk (state=3): >>>/root <<< 25675 1727204022.23252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.23296: stderr chunk 
(state=3): >>><<< 25675 1727204022.23298: stdout chunk (state=3): >>><<< 25675 1727204022.23312: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204022.23326: _low_level_execute_command(): starting 25675 1727204022.23374: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878 `" && echo ansible-tmp-1727204022.2331705-29031-31504842738878="` echo /root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878 `" ) && sleep 0' 25675 1727204022.23799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204022.23804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204022.23815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.23817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204022.23819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.23822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.23864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204022.23867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204022.23873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.23946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.25891: stdout chunk (state=3): 
>>>ansible-tmp-1727204022.2331705-29031-31504842738878=/root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878 <<< 25675 1727204022.26040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.26043: stdout chunk (state=3): >>><<< 25675 1727204022.26045: stderr chunk (state=3): >>><<< 25675 1727204022.26060: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204022.2331705-29031-31504842738878=/root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204022.26286: variable 'ansible_module_compression' from source: unknown 25675 1727204022.26289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25675 1727204022.26291: variable 'ansible_facts' from source: unknown 25675 1727204022.26301: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/AnsiballZ_stat.py 25675 1727204022.26502: Sending initial data 25675 1727204022.26505: Sent initial data (152 bytes) 25675 1727204022.26906: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.26919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727204022.26932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.26977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204022.26997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.27068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.28683: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204022.28752: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204022.28846: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmp5kr1gd5g /root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/AnsiballZ_stat.py <<< 25675 1727204022.28849: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/AnsiballZ_stat.py" <<< 25675 1727204022.28934: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmp5kr1gd5g" to remote "/root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/AnsiballZ_stat.py" <<< 25675 1727204022.29616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.29661: stderr chunk (state=3): >>><<< 25675 1727204022.29667: stdout chunk (state=3): >>><<< 25675 1727204022.29715: done transferring module to remote 25675 1727204022.29730: _low_level_execute_command(): starting 25675 1727204022.29736: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/ /root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/AnsiballZ_stat.py && sleep 0' 25675 1727204022.30138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.30153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204022.30169: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
25675 1727204022.30211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204022.30224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.30298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.32142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.32165: stderr chunk (state=3): >>><<< 25675 1727204022.32168: stdout chunk (state=3): >>><<< 25675 1727204022.32184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204022.32187: _low_level_execute_command(): starting 25675 1727204022.32192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/AnsiballZ_stat.py && sleep 0' 25675 1727204022.32603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.32606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204022.32608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.32611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.32613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.32660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204022.32663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
<<< 25675 1727204022.32745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.47938: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 25675 1727204022.49448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204022.49453: stdout chunk (state=3): >>><<< 25675 1727204022.49456: stderr chunk (state=3): >>><<< 25675 1727204022.49458: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
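The stat call above checked /etc/sysconfig/network-scripts/ifcfg-lsr27 on managed-node2 and found no file (stat.exists is false). A hedged reconstruction of the "Stat profile file" task, assembled from the module_args shown in this result, might look like the sketch below; the {{ profile }} variable in the path and the profile_stat register name are inferred from the surrounding 'profile'/'interface' variable lookups and the later profile_stat.stat.exists condition, not taken verbatim from get_profile_stat.yml:

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # resolved to ifcfg-lsr27 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # name inferred from the profile_stat.stat.exists check later in the log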
25675 1727204022.49461: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204022.49464: _low_level_execute_command(): starting 25675 1727204022.49466: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204022.2331705-29031-31504842738878/ > /dev/null 2>&1 && sleep 0' 25675 1727204022.50086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.50101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25675 1727204022.50112: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.50206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.50278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.52229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.52272: stdout chunk (state=3): >>><<< 25675 1727204022.52585: stderr chunk (state=3): >>><<< 25675 1727204022.52590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204022.52592: handler run complete 25675 1727204022.52595: attempt loop complete, returning result 25675 1727204022.52597: _execute() done 25675 1727204022.52599: dumping result to json 25675 1727204022.52601: done dumping result, returning 25675 1727204022.52603: done running TaskExecutor() for managed-node2/TASK: Stat profile file [028d2410-947f-41bd-b19d-000000000503] 25675 1727204022.52605: sending task result for task 028d2410-947f-41bd-b19d-000000000503 25675 1727204022.52756: done sending task result for task 028d2410-947f-41bd-b19d-000000000503 25675 1727204022.52760: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 25675 1727204022.52824: no more pending results, returning what we have 25675 1727204022.52828: results queue empty 25675 1727204022.52829: checking for any_errors_fatal 25675 1727204022.52838: done checking for any_errors_fatal 25675 1727204022.52839: checking for max_fail_percentage 25675 1727204022.52841: done checking for max_fail_percentage 25675 1727204022.52842: checking to see if all hosts have failed and the running result is not ok 25675 1727204022.52843: done checking to see if all hosts have failed 25675 1727204022.52844: getting the remaining hosts for this loop 25675 1727204022.52845: done getting the remaining hosts for this loop 25675 1727204022.52849: getting the next task for host managed-node2 25675 1727204022.52856: done getting next task for host managed-node2 25675 1727204022.52859: ^ task is: TASK: Set NM profile exist flag based on the profile files 25675 1727204022.52863: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204022.52868: getting variables 25675 1727204022.52870: in VariableManager get_vars() 25675 1727204022.53111: Calling all_inventory to load vars for managed-node2 25675 1727204022.53115: Calling groups_inventory to load vars for managed-node2 25675 1727204022.53119: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204022.53131: Calling all_plugins_play to load vars for managed-node2 25675 1727204022.53135: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204022.53138: Calling groups_plugins_play to load vars for managed-node2 25675 1727204022.56196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204022.63211: done with get_vars() 25675 1727204022.63236: done getting variables 25675 1727204022.63290: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:53:42 -0400 (0:00:00.451) 0:00:42.084 ***** 25675 1727204022.63317: entering _queue_task() for managed-node2/set_fact 25675 1727204022.63669: worker is 1 (out of 1 available) 25675 1727204022.63685: exiting _queue_task() for managed-node2/set_fact 25675 1727204022.63697: done queuing things up, now waiting for results queue to drain 25675 1727204022.63699: waiting for pending results... 
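The task queued above (get_profile_stat.yml:17) is skipped in the result that follows because profile_stat.stat.exists evaluated to False. A hedged sketch of a conditional set_fact of this shape, with the flag assignment assumed from the initialization task earlier in this log, could be:

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true   # flag name assumed from the earlier initialization task
  when: profile_stat.stat.exists   # condition taken from the "false_condition" shown in the skip result below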
25675 1727204022.63996: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 25675 1727204022.64139: in run() - task 028d2410-947f-41bd-b19d-000000000504 25675 1727204022.64223: variable 'ansible_search_path' from source: unknown 25675 1727204022.64227: variable 'ansible_search_path' from source: unknown 25675 1727204022.64231: calling self._execute() 25675 1727204022.64305: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.64318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.64338: variable 'omit' from source: magic vars 25675 1727204022.64747: variable 'ansible_distribution_major_version' from source: facts 25675 1727204022.64769: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204022.64905: variable 'profile_stat' from source: set_fact 25675 1727204022.64929: Evaluated conditional (profile_stat.stat.exists): False 25675 1727204022.64937: when evaluation is False, skipping this task 25675 1727204022.64945: _execute() done 25675 1727204022.64984: dumping result to json 25675 1727204022.64987: done dumping result, returning 25675 1727204022.64989: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-41bd-b19d-000000000504] 25675 1727204022.64991: sending task result for task 028d2410-947f-41bd-b19d-000000000504 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25675 1727204022.65249: no more pending results, returning what we have 25675 1727204022.65253: results queue empty 25675 1727204022.65254: checking for any_errors_fatal 25675 1727204022.65266: done checking for any_errors_fatal 25675 1727204022.65267: checking for max_fail_percentage 25675 1727204022.65269: done checking for max_fail_percentage 25675 1727204022.65269: checking to see if all hosts have failed and the running result is not ok 25675 1727204022.65270: done checking to see if all hosts have failed 25675 1727204022.65271: getting the remaining hosts for this loop 25675 1727204022.65273: done getting the remaining hosts for this loop 25675 1727204022.65280: getting the next task for host managed-node2 25675 1727204022.65288: done getting next task for host managed-node2 25675 1727204022.65290: ^ task is: TASK: Get NM profile info 25675 1727204022.65294: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204022.65298: getting variables 25675 1727204022.65301: in VariableManager get_vars() 25675 1727204022.65329: Calling all_inventory to load vars for managed-node2 25675 1727204022.65332: Calling groups_inventory to load vars for managed-node2 25675 1727204022.65336: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204022.65350: Calling all_plugins_play to load vars for managed-node2 25675 1727204022.65354: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204022.65358: Calling groups_plugins_play to load vars for managed-node2 25675 1727204022.65892: done sending task result for task 028d2410-947f-41bd-b19d-000000000504 25675 1727204022.65896: WORKER PROCESS EXITING 25675 1727204022.66928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204022.68490: done with get_vars() 25675 1727204022.68513: done getting variables 25675 1727204022.68609: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:53:42 -0400 (0:00:00.053) 0:00:42.137 ***** 25675 1727204022.68645: entering _queue_task() for managed-node2/shell 25675 1727204022.68647: Creating lock for shell 25675 1727204022.69185: worker is 1 (out of 1 available) 25675 1727204022.69193: exiting _queue_task() for managed-node2/shell 25675 1727204022.69202: done queuing things up, now waiting for results queue to drain 25675 1727204022.69203: waiting for pending results... 
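[Annotation] The TASK [Get NM profile info] header above maps to get_profile_stat.yml:25 and is dispatched through the shell action plugin. Putting together the command string that appears in the module result further down, the "nm_profile_exists.rc == 0" condition evaluated by the following task, and the "...ignoring" marker on the failed result, the task likely resembles the sketch below (a reconstruction, not a quote of the file; lsr27 is the templated value of {{ profile }} here).

    # Hedged reconstruction of get_profile_stat.yml:25; register name and
    # ignore_errors are inferred from later log entries.
    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      ignore_errors: true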
25675 1727204022.69261: running TaskExecutor() for managed-node2/TASK: Get NM profile info 25675 1727204022.69407: in run() - task 028d2410-947f-41bd-b19d-000000000505 25675 1727204022.69436: variable 'ansible_search_path' from source: unknown 25675 1727204022.69445: variable 'ansible_search_path' from source: unknown 25675 1727204022.69488: calling self._execute() 25675 1727204022.69574: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.69589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.69601: variable 'omit' from source: magic vars 25675 1727204022.69980: variable 'ansible_distribution_major_version' from source: facts 25675 1727204022.69996: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204022.70005: variable 'omit' from source: magic vars 25675 1727204022.70058: variable 'omit' from source: magic vars 25675 1727204022.70168: variable 'profile' from source: include params 25675 1727204022.70181: variable 'interface' from source: set_fact 25675 1727204022.70258: variable 'interface' from source: set_fact 25675 1727204022.70300: variable 'omit' from source: magic vars 25675 1727204022.70380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204022.70384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204022.70392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204022.70420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204022.70435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204022.70464: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204022.70471: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.70480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.70582: Set connection var ansible_shell_type to sh 25675 1727204022.70592: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204022.70601: Set connection var ansible_timeout to 10 25675 1727204022.70623: Set connection var ansible_pipelining to False 25675 1727204022.70626: Set connection var ansible_shell_executable to /bin/sh 25675 1727204022.70628: Set connection var ansible_connection to ssh 25675 1727204022.70654: variable 'ansible_shell_executable' from source: unknown 25675 1727204022.70733: variable 'ansible_connection' from source: unknown 25675 1727204022.70736: variable 'ansible_module_compression' from source: unknown 25675 1727204022.70739: variable 'ansible_shell_type' from source: unknown 25675 1727204022.70741: variable 'ansible_shell_executable' from source: unknown 25675 1727204022.70743: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204022.70744: variable 'ansible_pipelining' from source: unknown 25675 1727204022.70747: variable 'ansible_timeout' from source: unknown 25675 1727204022.70748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204022.70829: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204022.70851: variable 'omit' from source: magic vars 25675 1727204022.70860: starting attempt loop 25675 1727204022.70866: running the handler 25675 1727204022.70880: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204022.70901: _low_level_execute_command(): starting 25675 1727204022.70912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204022.71697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.71736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204022.71753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204022.71777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.71890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.73627: stdout chunk (state=3): >>>/root <<< 25675 1727204022.73762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.73779: stderr chunk (state=3): >>><<< 25675 1727204022.73793: stdout chunk (state=3): >>><<< 25675 1727204022.73822: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204022.73842: _low_level_execute_command(): starting 25675 1727204022.73881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105 `" && echo ansible-tmp-1727204022.7383015-29143-28845285855105="` echo /root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105 `" ) && sleep 0' 25675 1727204022.74474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204022.74492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204022.74523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.74625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204022.74671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.74753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.76705: stdout chunk (state=3): >>>ansible-tmp-1727204022.7383015-29143-28845285855105=/root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105 <<< 25675 1727204022.76868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.76872: stdout chunk (state=3): >>><<< 25675 1727204022.76874: stderr chunk (state=3): >>><<< 25675 1727204022.77039: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204022.7383015-29143-28845285855105=/root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204022.77043: variable 'ansible_module_compression' from source: unknown 25675 1727204022.77046: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727204022.77049: variable 'ansible_facts' from source: unknown 25675 1727204022.77331: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/AnsiballZ_command.py 25675 1727204022.77798: Sending initial data 25675 1727204022.77807: Sent initial data (155 bytes) 25675 1727204022.78493: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204022.78497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.78549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204022.78566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204022.78590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.78765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.80327: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204022.80403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204022.80487: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpyxe6jrmo /root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/AnsiballZ_command.py <<< 25675 1727204022.80490: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/AnsiballZ_command.py" <<< 25675 1727204022.80593: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpyxe6jrmo" to remote "/root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/AnsiballZ_command.py" <<< 25675 1727204022.81671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.81706: stderr chunk (state=3): >>><<< 25675 1727204022.81716: stdout chunk (state=3): >>><<< 25675 1727204022.81844: done transferring module to remote 25675 1727204022.81848: _low_level_execute_command(): starting 25675 1727204022.81851: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/ /root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/AnsiballZ_command.py && sleep 0' 25675 1727204022.82367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204022.82389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204022.82404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.82422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204022.82441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204022.82454: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204022.82469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.82504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204022.82591: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204022.82606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.82706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204022.84606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204022.84661: stdout chunk (state=3): >>><<< 25675 1727204022.84674: stderr chunk (state=3): >>><<< 25675 1727204022.84696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204022.84959: _low_level_execute_command(): starting 25675 1727204022.84963: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/AnsiballZ_command.py && sleep 0' 25675 1727204022.86110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204022.86128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204022.86195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204022.86198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204022.86379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204022.86481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204022.86553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204023.03660: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-24 14:53:43.018704", "end": "2024-09-24 14:53:43.034905", "delta": "0:00:00.016201", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727204023.05248: stderr chunk (state=3): >>>debug2: 
Received exit status from master 1 Shared connection to 10.31.13.254 closed. <<< 25675 1727204023.05260: stdout chunk (state=3): >>><<< 25675 1727204023.05522: stderr chunk (state=3): >>><<< 25675 1727204023.05526: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-24 14:53:43.018704", "end": "2024-09-24 14:53:43.034905", "delta": "0:00:00.016201", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.254 closed. 
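[Annotation] The JSON blob above is the return of the AnsiballZ-wrapped command module: rc=1 because nmcli reported no connection named after lsr27 whose backing file lives under /etc. Rendered as YAML for readability, the fields that matter to the rest of this play are roughly the following; the top-level name nm_profile_exists is assumed from the "nm_profile_exists.rc == 0" condition evaluated shortly after.

    # Values copied from the module JSON above; the variable name is an assumption.
    nm_profile_exists:
      cmd: "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc"
      rc: 1
      stdout: ""
      stderr: ""
      failed: true
      msg: non-zero return code
      delta: "0:00:00.016201"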
25675 1727204023.05528: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204023.05531: _low_level_execute_command(): starting 25675 1727204023.05533: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204022.7383015-29143-28845285855105/ > /dev/null 2>&1 && sleep 0' 25675 1727204023.06708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204023.06894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204023.06979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204023.07007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204023.07034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204023.07046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204023.07210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204023.09210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204023.09219: stdout chunk (state=3): >>><<< 25675 1727204023.09230: stderr chunk (state=3): >>><<< 25675 1727204023.09299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204023.09310: handler run complete 25675 1727204023.09335: Evaluated conditional (False): False 25675 1727204023.09568: attempt loop complete, returning result 25675 1727204023.09571: _execute() done 25675 1727204023.09574: dumping result to json 25675 1727204023.09580: done dumping result, returning 25675 1727204023.09583: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [028d2410-947f-41bd-b19d-000000000505] 25675 1727204023.09585: sending task result for task 028d2410-947f-41bd-b19d-000000000505 25675 1727204023.09653: done sending task result for task 028d2410-947f-41bd-b19d-000000000505 25675 1727204023.09655: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.016201", "end": "2024-09-24 14:53:43.034905", "rc": 1, "start": "2024-09-24 14:53:43.018704" } MSG: non-zero return code ...ignoring 25675 1727204023.09747: no more pending results, returning what we have 25675 1727204023.09751: results queue empty 25675 1727204023.09752: checking for any_errors_fatal 25675 1727204023.09758: done checking for any_errors_fatal 25675 1727204023.09759: checking for max_fail_percentage 25675 1727204023.09761: done checking for max_fail_percentage 25675 1727204023.09762: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.09762: done checking to see if all hosts have failed 25675 1727204023.09763: getting the remaining hosts for this loop 25675 1727204023.09764: done getting the remaining hosts for this loop 25675 1727204023.09768: getting the next task for host managed-node2 25675 1727204023.09781: done getting next task for host managed-node2 25675 1727204023.09785: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 25675 1727204023.09789: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204023.09792: getting variables 25675 1727204023.09794: in VariableManager get_vars() 25675 1727204023.09824: Calling all_inventory to load vars for managed-node2 25675 1727204023.09826: Calling groups_inventory to load vars for managed-node2 25675 1727204023.09829: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.09841: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.09844: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.09847: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.13181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.16409: done with get_vars() 25675 1727204023.16434: done getting variables 25675 1727204023.16705: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.480) 0:00:42.618 ***** 25675 1727204023.16737: entering _queue_task() for managed-node2/set_fact 25675 1727204023.17531: worker is 1 (out of 1 available) 25675 1727204023.17544: exiting _queue_task() for managed-node2/set_fact 25675 1727204023.17556: done queuing things up, now waiting for results queue to drain 25675 1727204023.17558: waiting for pending results... 
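[Annotation] The task queued above (get_profile_stat.yml:35) is gated on the nmcli result: since nm_profile_exists.rc came back as 1, the condition evaluates False and the set_fact is skipped in the next chunk. A minimal sketch of its likely shape, with the flag names as assumptions and the when-condition quoted from the skip reason:

    # Hypothetical shape of get_profile_stat.yml:35.
    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true            # assumed
        lsr_net_profile_ansible_managed: true   # assumed
      when: nm_profile_exists.rc == 0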
25675 1727204023.18092: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 25675 1727204023.18104: in run() - task 028d2410-947f-41bd-b19d-000000000506 25675 1727204023.18125: variable 'ansible_search_path' from source: unknown 25675 1727204023.18132: variable 'ansible_search_path' from source: unknown 25675 1727204023.18172: calling self._execute() 25675 1727204023.18500: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.18507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.18525: variable 'omit' from source: magic vars 25675 1727204023.19307: variable 'ansible_distribution_major_version' from source: facts 25675 1727204023.19325: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204023.19469: variable 'nm_profile_exists' from source: set_fact 25675 1727204023.19881: Evaluated conditional (nm_profile_exists.rc == 0): False 25675 1727204023.19885: when evaluation is False, skipping this task 25675 1727204023.19887: _execute() done 25675 1727204023.19890: dumping result to json 25675 1727204023.19893: done dumping result, returning 25675 1727204023.19895: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-41bd-b19d-000000000506] 25675 1727204023.19898: sending task result for task 028d2410-947f-41bd-b19d-000000000506 25675 1727204023.19980: done sending task result for task 028d2410-947f-41bd-b19d-000000000506 25675 1727204023.19984: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 25675 1727204023.20033: no more pending results, returning what we have 25675 1727204023.20038: results queue empty 25675 1727204023.20038: checking for any_errors_fatal 25675 1727204023.20049: done checking for any_errors_fatal 25675 1727204023.20050: checking for max_fail_percentage 25675 1727204023.20052: done checking for max_fail_percentage 25675 1727204023.20052: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.20053: done checking to see if all hosts have failed 25675 1727204023.20054: getting the remaining hosts for this loop 25675 1727204023.20055: done getting the remaining hosts for this loop 25675 1727204023.20059: getting the next task for host managed-node2 25675 1727204023.20070: done getting next task for host managed-node2 25675 1727204023.20073: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 25675 1727204023.20082: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 25675 1727204023.20087: getting variables 25675 1727204023.20089: in VariableManager get_vars() 25675 1727204023.20121: Calling all_inventory to load vars for managed-node2 25675 1727204023.20124: Calling groups_inventory to load vars for managed-node2 25675 1727204023.20128: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.20142: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.20145: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.20149: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.22914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.25036: done with get_vars() 25675 1727204023.25067: done getting variables 25675 1727204023.25132: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727204023.25265: variable 'profile' from source: include params 25675 1727204023.25270: variable 'interface' from source: set_fact 25675 1727204023.25360: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.086) 0:00:42.705 ***** 25675 1727204023.25405: entering _queue_task() for managed-node2/command 25675 1727204023.25982: worker is 1 (out of 1 available) 25675 1727204023.25993: exiting _queue_task() for managed-node2/command 25675 1727204023.26005: done queuing things up, now waiting for results queue to drain 25675 1727204023.26006: waiting for pending results... 
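[Annotation] The TASK [Get the ansible_managed comment in ifcfg-lsr27] header above comes from get_profile_stat.yml:49, with {{ profile }} templated from the interface fact (lsr27). The log confirms only the task name, the command action plugin, and the profile_stat.stat.exists guard that causes the skip below; the grep pattern, file path, and register name in this sketch are illustrative assumptions.

    # Sketch only; command details are not visible in this log.
    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: ansible_managed_comment   # assumed name
      when: profile_stat.stat.exists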
25675 1727204023.26113: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 25675 1727204023.26290: in run() - task 028d2410-947f-41bd-b19d-000000000508 25675 1727204023.26311: variable 'ansible_search_path' from source: unknown 25675 1727204023.26320: variable 'ansible_search_path' from source: unknown 25675 1727204023.26368: calling self._execute() 25675 1727204023.26484: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.26530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.26534: variable 'omit' from source: magic vars 25675 1727204023.27003: variable 'ansible_distribution_major_version' from source: facts 25675 1727204023.27012: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204023.27144: variable 'profile_stat' from source: set_fact 25675 1727204023.27163: Evaluated conditional (profile_stat.stat.exists): False 25675 1727204023.27170: when evaluation is False, skipping this task 25675 1727204023.27221: _execute() done 25675 1727204023.27228: dumping result to json 25675 1727204023.27231: done dumping result, returning 25675 1727204023.27233: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 [028d2410-947f-41bd-b19d-000000000508] 25675 1727204023.27235: sending task result for task 028d2410-947f-41bd-b19d-000000000508 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25675 1727204023.27388: no more pending results, returning what we have 25675 1727204023.27396: results queue empty 25675 1727204023.27397: checking for any_errors_fatal 25675 1727204023.27405: done checking for any_errors_fatal 25675 1727204023.27405: checking for max_fail_percentage 25675 1727204023.27407: done checking for max_fail_percentage 25675 1727204023.27408: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.27409: done checking to see if all hosts have failed 25675 1727204023.27409: getting the remaining hosts for this loop 25675 1727204023.27411: done getting the remaining hosts for this loop 25675 1727204023.27415: getting the next task for host managed-node2 25675 1727204023.27423: done getting next task for host managed-node2 25675 1727204023.27426: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 25675 1727204023.27430: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204023.27437: getting variables 25675 1727204023.27439: in VariableManager get_vars() 25675 1727204023.27467: Calling all_inventory to load vars for managed-node2 25675 1727204023.27471: Calling groups_inventory to load vars for managed-node2 25675 1727204023.27477: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.27493: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.27497: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.27501: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.28398: done sending task result for task 028d2410-947f-41bd-b19d-000000000508 25675 1727204023.28402: WORKER PROCESS EXITING 25675 1727204023.31146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.34614: done with get_vars() 25675 1727204023.34645: done getting variables 25675 1727204023.34866: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727204023.35196: variable 'profile' from source: include params 25675 1727204023.35200: variable 'interface' from source: set_fact 25675 1727204023.35259: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.098) 0:00:42.804 ***** 25675 1727204023.35296: entering _queue_task() for managed-node2/set_fact 25675 1727204023.36097: worker is 1 (out of 1 available) 25675 1727204023.36111: exiting _queue_task() for managed-node2/set_fact 25675 1727204023.36122: done queuing things up, now waiting for results queue to drain 25675 1727204023.36123: waiting for pending results... 
25675 1727204023.36782: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 25675 1727204023.37109: in run() - task 028d2410-947f-41bd-b19d-000000000509 25675 1727204023.37315: variable 'ansible_search_path' from source: unknown 25675 1727204023.37400: variable 'ansible_search_path' from source: unknown 25675 1727204023.37440: calling self._execute() 25675 1727204023.38039: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.38042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.38045: variable 'omit' from source: magic vars 25675 1727204023.38982: variable 'ansible_distribution_major_version' from source: facts 25675 1727204023.38985: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204023.38988: variable 'profile_stat' from source: set_fact 25675 1727204023.38990: Evaluated conditional (profile_stat.stat.exists): False 25675 1727204023.38993: when evaluation is False, skipping this task 25675 1727204023.38997: _execute() done 25675 1727204023.39000: dumping result to json 25675 1727204023.39188: done dumping result, returning 25675 1727204023.39199: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [028d2410-947f-41bd-b19d-000000000509] 25675 1727204023.39209: sending task result for task 028d2410-947f-41bd-b19d-000000000509 25675 1727204023.39329: done sending task result for task 028d2410-947f-41bd-b19d-000000000509 25675 1727204023.39337: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25675 1727204023.39389: no more pending results, returning what we have 25675 1727204023.39395: results queue empty 25675 1727204023.39396: checking for any_errors_fatal 25675 1727204023.39402: done checking for any_errors_fatal 25675 1727204023.39403: checking for max_fail_percentage 25675 1727204023.39405: done checking for max_fail_percentage 25675 1727204023.39406: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.39407: done checking to see if all hosts have failed 25675 1727204023.39407: getting the remaining hosts for this loop 25675 1727204023.39409: done getting the remaining hosts for this loop 25675 1727204023.39415: getting the next task for host managed-node2 25675 1727204023.39423: done getting next task for host managed-node2 25675 1727204023.39426: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 25675 1727204023.39430: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204023.39435: getting variables 25675 1727204023.39437: in VariableManager get_vars() 25675 1727204023.39469: Calling all_inventory to load vars for managed-node2 25675 1727204023.39473: Calling groups_inventory to load vars for managed-node2 25675 1727204023.39480: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.39494: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.39497: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.39499: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.42828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.46766: done with get_vars() 25675 1727204023.46907: done getting variables 25675 1727204023.46966: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727204023.47292: variable 'profile' from source: include params 25675 1727204023.47296: variable 'interface' from source: set_fact 25675 1727204023.47505: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.122) 0:00:42.926 ***** 25675 1727204023.47538: entering _queue_task() for managed-node2/command 25675 1727204023.48316: worker is 1 (out of 1 available) 25675 1727204023.48328: exiting _queue_task() for managed-node2/command 25675 1727204023.48338: done queuing things up, now waiting for results queue to drain 25675 1727204023.48339: waiting for pending results... 
25675 1727204023.48694: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr27 25675 1727204023.49784: in run() - task 028d2410-947f-41bd-b19d-00000000050a 25675 1727204023.49788: variable 'ansible_search_path' from source: unknown 25675 1727204023.49792: variable 'ansible_search_path' from source: unknown 25675 1727204023.49795: calling self._execute() 25675 1727204023.49798: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.49800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.49803: variable 'omit' from source: magic vars 25675 1727204023.50544: variable 'ansible_distribution_major_version' from source: facts 25675 1727204023.50983: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204023.50987: variable 'profile_stat' from source: set_fact 25675 1727204023.50990: Evaluated conditional (profile_stat.stat.exists): False 25675 1727204023.50993: when evaluation is False, skipping this task 25675 1727204023.50996: _execute() done 25675 1727204023.50998: dumping result to json 25675 1727204023.51001: done dumping result, returning 25675 1727204023.51004: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr27 [028d2410-947f-41bd-b19d-00000000050a] 25675 1727204023.51006: sending task result for task 028d2410-947f-41bd-b19d-00000000050a 25675 1727204023.51447: done sending task result for task 028d2410-947f-41bd-b19d-00000000050a 25675 1727204023.51452: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25675 1727204023.51503: no more pending results, returning what we have 25675 1727204023.51507: results queue empty 25675 1727204023.51508: checking for any_errors_fatal 25675 1727204023.51513: done checking for any_errors_fatal 25675 1727204023.51514: checking for max_fail_percentage 25675 1727204023.51515: done checking for max_fail_percentage 25675 1727204023.51516: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.51517: done checking to see if all hosts have failed 25675 1727204023.51517: getting the remaining hosts for this loop 25675 1727204023.51519: done getting the remaining hosts for this loop 25675 1727204023.51522: getting the next task for host managed-node2 25675 1727204023.51529: done getting next task for host managed-node2 25675 1727204023.51531: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 25675 1727204023.51534: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204023.51538: getting variables 25675 1727204023.51539: in VariableManager get_vars() 25675 1727204023.51568: Calling all_inventory to load vars for managed-node2 25675 1727204023.51570: Calling groups_inventory to load vars for managed-node2 25675 1727204023.51574: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.51588: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.51591: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.51595: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.54503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.57605: done with get_vars() 25675 1727204023.57636: done getting variables 25675 1727204023.57700: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727204023.58113: variable 'profile' from source: include params 25675 1727204023.58116: variable 'interface' from source: set_fact 25675 1727204023.58169: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.106) 0:00:43.033 ***** 25675 1727204023.58201: entering _queue_task() for managed-node2/set_fact 25675 1727204023.58944: worker is 1 (out of 1 available) 25675 1727204023.58957: exiting _queue_task() for managed-node2/set_fact 25675 1727204023.58971: done queuing things up, now waiting for results queue to drain 25675 1727204023.58972: waiting for pending results... 
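[Annotation] The remaining comment checks (get_profile_stat.yml:56, :62, :69 - verify ansible_managed, get fingerprint, verify fingerprint) repeat the pattern already seen: each is guarded by the stat of the profile file, so with no ifcfg-lsr27 present they all skip with the same false_condition. A compact sketch of that shared guard, using the task queued above as the example; everything beyond the task name and the guard is assumed.

    # Common guard on the ifcfg comment checks; flag name is an assumption.
    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_fingerprint: true   # assumed flag name
      when: profile_stat.stat.exists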
25675 1727204023.59447: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 25675 1727204023.59787: in run() - task 028d2410-947f-41bd-b19d-00000000050b 25675 1727204023.59813: variable 'ansible_search_path' from source: unknown 25675 1727204023.59822: variable 'ansible_search_path' from source: unknown 25675 1727204023.59884: calling self._execute() 25675 1727204023.60035: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.60155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.60175: variable 'omit' from source: magic vars 25675 1727204023.61003: variable 'ansible_distribution_major_version' from source: facts 25675 1727204023.61158: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204023.61304: variable 'profile_stat' from source: set_fact 25675 1727204023.61435: Evaluated conditional (profile_stat.stat.exists): False 25675 1727204023.61445: when evaluation is False, skipping this task 25675 1727204023.61453: _execute() done 25675 1727204023.61461: dumping result to json 25675 1727204023.61477: done dumping result, returning 25675 1727204023.61493: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 [028d2410-947f-41bd-b19d-00000000050b] 25675 1727204023.61503: sending task result for task 028d2410-947f-41bd-b19d-00000000050b skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25675 1727204023.61712: no more pending results, returning what we have 25675 1727204023.61717: results queue empty 25675 1727204023.61718: checking for any_errors_fatal 25675 1727204023.61725: done checking for any_errors_fatal 25675 1727204023.61725: checking for max_fail_percentage 25675 1727204023.61727: done checking for max_fail_percentage 25675 1727204023.61727: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.61728: done checking to see if all hosts have failed 25675 1727204023.61729: getting the remaining hosts for this loop 25675 1727204023.61730: done getting the remaining hosts for this loop 25675 1727204023.61734: getting the next task for host managed-node2 25675 1727204023.61742: done getting next task for host managed-node2 25675 1727204023.61745: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 25675 1727204023.61748: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204023.61752: getting variables 25675 1727204023.61754: in VariableManager get_vars() 25675 1727204023.61784: Calling all_inventory to load vars for managed-node2 25675 1727204023.61786: Calling groups_inventory to load vars for managed-node2 25675 1727204023.61791: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.61805: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.61809: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.61812: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.62489: done sending task result for task 028d2410-947f-41bd-b19d-00000000050b 25675 1727204023.62493: WORKER PROCESS EXITING 25675 1727204023.64095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.66309: done with get_vars() 25675 1727204023.66338: done getting variables 25675 1727204023.66408: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727204023.66529: variable 'profile' from source: include params 25675 1727204023.66533: variable 'interface' from source: set_fact 25675 1727204023.66603: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.084) 0:00:43.117 ***** 25675 1727204023.66633: entering _queue_task() for managed-node2/assert 25675 1727204023.66968: worker is 1 (out of 1 available) 25675 1727204023.66987: exiting _queue_task() for managed-node2/assert 25675 1727204023.66999: done queuing things up, now waiting for results queue to drain 25675 1727204023.67001: waiting for pending results... 
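At this point the play moves on to assert_profile_absent.yml:5. The condition actually checked, 'not lsr_net_profile_exists', is visible in the evaluation that follows; below is a hedged sketch of such an assert task, assuming a conventional failure message (the real msg text, if any, is not shown in this log).

- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists
    # hypothetical failure message, not taken from this log
    msg: "Profile {{ profile }} is still present"

The entries that follow show this condition evaluating to True and the task returning 'All assertions passed' with changed: false, since assert never modifies the managed host.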
25675 1727204023.67303: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'lsr27' 25675 1727204023.67422: in run() - task 028d2410-947f-41bd-b19d-0000000004f6 25675 1727204023.67426: variable 'ansible_search_path' from source: unknown 25675 1727204023.67428: variable 'ansible_search_path' from source: unknown 25675 1727204023.67446: calling self._execute() 25675 1727204023.67586: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.67590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.67593: variable 'omit' from source: magic vars 25675 1727204023.68015: variable 'ansible_distribution_major_version' from source: facts 25675 1727204023.68049: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204023.68182: variable 'omit' from source: magic vars 25675 1727204023.68186: variable 'omit' from source: magic vars 25675 1727204023.68218: variable 'profile' from source: include params 25675 1727204023.68229: variable 'interface' from source: set_fact 25675 1727204023.68297: variable 'interface' from source: set_fact 25675 1727204023.68383: variable 'omit' from source: magic vars 25675 1727204023.68387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204023.68422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204023.68462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204023.68490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204023.68507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204023.68547: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204023.68631: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.68639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.68694: Set connection var ansible_shell_type to sh 25675 1727204023.68705: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204023.68716: Set connection var ansible_timeout to 10 25675 1727204023.68730: Set connection var ansible_pipelining to False 25675 1727204023.68755: Set connection var ansible_shell_executable to /bin/sh 25675 1727204023.68817: Set connection var ansible_connection to ssh 25675 1727204023.68852: variable 'ansible_shell_executable' from source: unknown 25675 1727204023.68960: variable 'ansible_connection' from source: unknown 25675 1727204023.68963: variable 'ansible_module_compression' from source: unknown 25675 1727204023.68966: variable 'ansible_shell_type' from source: unknown 25675 1727204023.68968: variable 'ansible_shell_executable' from source: unknown 25675 1727204023.68970: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.68972: variable 'ansible_pipelining' from source: unknown 25675 1727204023.68974: variable 'ansible_timeout' from source: unknown 25675 1727204023.68982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.69048: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204023.69070: variable 'omit' from source: magic vars 25675 1727204023.69094: starting attempt loop 25675 1727204023.69102: running the handler 25675 1727204023.69240: variable 'lsr_net_profile_exists' from source: set_fact 25675 1727204023.69250: Evaluated conditional (not lsr_net_profile_exists): True 25675 1727204023.69260: handler run complete 25675 1727204023.69286: attempt loop complete, returning result 25675 1727204023.69294: _execute() done 25675 1727204023.69301: dumping result to json 25675 1727204023.69309: done dumping result, returning 25675 1727204023.69319: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'lsr27' [028d2410-947f-41bd-b19d-0000000004f6] 25675 1727204023.69327: sending task result for task 028d2410-947f-41bd-b19d-0000000004f6 25675 1727204023.69530: done sending task result for task 028d2410-947f-41bd-b19d-0000000004f6 25675 1727204023.69534: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 25675 1727204023.69581: no more pending results, returning what we have 25675 1727204023.69585: results queue empty 25675 1727204023.69586: checking for any_errors_fatal 25675 1727204023.69594: done checking for any_errors_fatal 25675 1727204023.69595: checking for max_fail_percentage 25675 1727204023.69596: done checking for max_fail_percentage 25675 1727204023.69597: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.69598: done checking to see if all hosts have failed 25675 1727204023.69598: getting the remaining hosts for this loop 25675 1727204023.69600: done getting the remaining hosts for this loop 25675 1727204023.69605: getting the next task for host managed-node2 25675 1727204023.69613: done getting next task for host managed-node2 25675 1727204023.69616: ^ task is: TASK: Include the task 'assert_device_absent.yml' 25675 1727204023.69617: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204023.69621: getting variables 25675 1727204023.69622: in VariableManager get_vars() 25675 1727204023.69651: Calling all_inventory to load vars for managed-node2 25675 1727204023.69654: Calling groups_inventory to load vars for managed-node2 25675 1727204023.69658: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.69667: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.69670: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.69673: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.72500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.75958: done with get_vars() 25675 1727204023.75992: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.095) 0:00:43.213 ***** 25675 1727204023.76201: entering _queue_task() for managed-node2/include_tasks 25675 1727204023.76908: worker is 1 (out of 1 available) 25675 1727204023.76920: exiting _queue_task() for managed-node2/include_tasks 25675 1727204023.76984: done queuing things up, now waiting for results queue to drain 25675 1727204023.76989: waiting for pending results... 25675 1727204023.77495: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' 25675 1727204023.77592: in run() - task 028d2410-947f-41bd-b19d-000000000075 25675 1727204023.77701: variable 'ansible_search_path' from source: unknown 25675 1727204023.77704: calling self._execute() 25675 1727204023.77746: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.77757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.77769: variable 'omit' from source: magic vars 25675 1727204023.78167: variable 'ansible_distribution_major_version' from source: facts 25675 1727204023.78187: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204023.78198: _execute() done 25675 1727204023.78208: dumping result to json 25675 1727204023.78216: done dumping result, returning 25675 1727204023.78227: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' [028d2410-947f-41bd-b19d-000000000075] 25675 1727204023.78242: sending task result for task 028d2410-947f-41bd-b19d-000000000075 25675 1727204023.78383: no more pending results, returning what we have 25675 1727204023.78389: in VariableManager get_vars() 25675 1727204023.78426: Calling all_inventory to load vars for managed-node2 25675 1727204023.78429: Calling groups_inventory to load vars for managed-node2 25675 1727204023.78433: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.78447: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.78451: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.78453: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.79300: done sending task result for task 028d2410-947f-41bd-b19d-000000000075 25675 1727204023.79304: WORKER PROCESS EXITING 25675 1727204023.80319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.81932: done with get_vars() 25675 
1727204023.81951: variable 'ansible_search_path' from source: unknown 25675 1727204023.81966: we have included files to process 25675 1727204023.81967: generating all_blocks data 25675 1727204023.81969: done generating all_blocks data 25675 1727204023.81974: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 25675 1727204023.81977: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 25675 1727204023.81980: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 25675 1727204023.82139: in VariableManager get_vars() 25675 1727204023.82160: done with get_vars() 25675 1727204023.82268: done processing included file 25675 1727204023.82270: iterating over new_blocks loaded from include file 25675 1727204023.82272: in VariableManager get_vars() 25675 1727204023.82284: done with get_vars() 25675 1727204023.82286: filtering new block on tags 25675 1727204023.82303: done filtering new block on tags 25675 1727204023.82305: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 25675 1727204023.82309: extending task lists for all hosts with included blocks 25675 1727204023.82448: done extending task lists 25675 1727204023.82449: done processing included files 25675 1727204023.82450: results queue empty 25675 1727204023.82451: checking for any_errors_fatal 25675 1727204023.82454: done checking for any_errors_fatal 25675 1727204023.82455: checking for max_fail_percentage 25675 1727204023.82456: done checking for max_fail_percentage 25675 1727204023.82457: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.82458: done checking to see if all hosts have failed 25675 1727204023.82458: getting the remaining hosts for this loop 25675 1727204023.82459: done getting the remaining hosts for this loop 25675 1727204023.82462: getting the next task for host managed-node2 25675 1727204023.82465: done getting next task for host managed-node2 25675 1727204023.82467: ^ task is: TASK: Include the task 'get_interface_stat.yml' 25675 1727204023.82474: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204023.82478: getting variables 25675 1727204023.82479: in VariableManager get_vars() 25675 1727204023.82487: Calling all_inventory to load vars for managed-node2 25675 1727204023.82489: Calling groups_inventory to load vars for managed-node2 25675 1727204023.82491: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.82496: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.82499: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.82501: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.83710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.86124: done with get_vars() 25675 1727204023.86146: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.102) 0:00:43.315 ***** 25675 1727204023.86442: entering _queue_task() for managed-node2/include_tasks 25675 1727204023.86995: worker is 1 (out of 1 available) 25675 1727204023.87007: exiting _queue_task() for managed-node2/include_tasks 25675 1727204023.87132: done queuing things up, now waiting for results queue to drain 25675 1727204023.87135: waiting for pending results... 25675 1727204023.87478: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 25675 1727204023.87756: in run() - task 028d2410-947f-41bd-b19d-00000000053c 25675 1727204023.87876: variable 'ansible_search_path' from source: unknown 25675 1727204023.87883: variable 'ansible_search_path' from source: unknown 25675 1727204023.87896: calling self._execute() 25675 1727204023.88173: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204023.88252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204023.88256: variable 'omit' from source: magic vars 25675 1727204023.89084: variable 'ansible_distribution_major_version' from source: facts 25675 1727204023.89087: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204023.89090: _execute() done 25675 1727204023.89092: dumping result to json 25675 1727204023.89094: done dumping result, returning 25675 1727204023.89097: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-41bd-b19d-00000000053c] 25675 1727204023.89100: sending task result for task 028d2410-947f-41bd-b19d-00000000053c 25675 1727204023.89209: no more pending results, returning what we have 25675 1727204023.89215: in VariableManager get_vars() 25675 1727204023.89253: Calling all_inventory to load vars for managed-node2 25675 1727204023.89255: Calling groups_inventory to load vars for managed-node2 25675 1727204023.89259: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.89272: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.89277: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.89280: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.90036: done sending task result for task 028d2410-947f-41bd-b19d-00000000053c 25675 1727204023.90040: WORKER PROCESS EXITING 25675 1727204023.91925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 25675 1727204023.95322: done with get_vars() 25675 1727204023.95349: variable 'ansible_search_path' from source: unknown 25675 1727204023.95351: variable 'ansible_search_path' from source: unknown 25675 1727204023.95391: we have included files to process 25675 1727204023.95392: generating all_blocks data 25675 1727204023.95394: done generating all_blocks data 25675 1727204023.95395: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25675 1727204023.95396: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25675 1727204023.95398: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25675 1727204023.95879: done processing included file 25675 1727204023.95881: iterating over new_blocks loaded from include file 25675 1727204023.95882: in VariableManager get_vars() 25675 1727204023.95895: done with get_vars() 25675 1727204023.95897: filtering new block on tags 25675 1727204023.95912: done filtering new block on tags 25675 1727204023.95914: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 25675 1727204023.95919: extending task lists for all hosts with included blocks 25675 1727204023.96024: done extending task lists 25675 1727204023.96025: done processing included files 25675 1727204023.96026: results queue empty 25675 1727204023.96027: checking for any_errors_fatal 25675 1727204023.96030: done checking for any_errors_fatal 25675 1727204023.96031: checking for max_fail_percentage 25675 1727204023.96032: done checking for max_fail_percentage 25675 1727204023.96032: checking to see if all hosts have failed and the running result is not ok 25675 1727204023.96033: done checking to see if all hosts have failed 25675 1727204023.96034: getting the remaining hosts for this loop 25675 1727204023.96035: done getting the remaining hosts for this loop 25675 1727204023.96037: getting the next task for host managed-node2 25675 1727204023.96041: done getting next task for host managed-node2 25675 1727204023.96044: ^ task is: TASK: Get stat for interface {{ interface }} 25675 1727204023.96046: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204023.96049: getting variables 25675 1727204023.96050: in VariableManager get_vars() 25675 1727204023.96058: Calling all_inventory to load vars for managed-node2 25675 1727204023.96061: Calling groups_inventory to load vars for managed-node2 25675 1727204023.96063: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204023.96068: Calling all_plugins_play to load vars for managed-node2 25675 1727204023.96071: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204023.96082: Calling groups_plugins_play to load vars for managed-node2 25675 1727204023.97356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204023.98970: done with get_vars() 25675 1727204023.98993: done getting variables 25675 1727204023.99152: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:53:43 -0400 (0:00:00.127) 0:00:43.443 ***** 25675 1727204023.99184: entering _queue_task() for managed-node2/stat 25675 1727204023.99540: worker is 1 (out of 1 available) 25675 1727204023.99664: exiting _queue_task() for managed-node2/stat 25675 1727204023.99674: done queuing things up, now waiting for results queue to drain 25675 1727204023.99676: waiting for pending results... 25675 1727204023.99849: running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 25675 1727204023.99972: in run() - task 028d2410-947f-41bd-b19d-000000000554 25675 1727204023.99999: variable 'ansible_search_path' from source: unknown 25675 1727204024.00011: variable 'ansible_search_path' from source: unknown 25675 1727204024.00050: calling self._execute() 25675 1727204024.00151: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.00163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204024.00179: variable 'omit' from source: magic vars 25675 1727204024.00558: variable 'ansible_distribution_major_version' from source: facts 25675 1727204024.00574: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204024.00588: variable 'omit' from source: magic vars 25675 1727204024.00659: variable 'omit' from source: magic vars 25675 1727204024.00748: variable 'interface' from source: set_fact 25675 1727204024.00774: variable 'omit' from source: magic vars 25675 1727204024.00852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204024.00867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204024.00900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204024.00922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204024.00961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204024.00984: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204024.00993: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.01000: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 25675 1727204024.01181: Set connection var ansible_shell_type to sh 25675 1727204024.01184: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204024.01186: Set connection var ansible_timeout to 10 25675 1727204024.01189: Set connection var ansible_pipelining to False 25675 1727204024.01191: Set connection var ansible_shell_executable to /bin/sh 25675 1727204024.01195: Set connection var ansible_connection to ssh 25675 1727204024.01197: variable 'ansible_shell_executable' from source: unknown 25675 1727204024.01199: variable 'ansible_connection' from source: unknown 25675 1727204024.01201: variable 'ansible_module_compression' from source: unknown 25675 1727204024.01203: variable 'ansible_shell_type' from source: unknown 25675 1727204024.01205: variable 'ansible_shell_executable' from source: unknown 25675 1727204024.01216: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.01226: variable 'ansible_pipelining' from source: unknown 25675 1727204024.01233: variable 'ansible_timeout' from source: unknown 25675 1727204024.01242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204024.01454: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25675 1727204024.01470: variable 'omit' from source: magic vars 25675 1727204024.01506: starting attempt loop 25675 1727204024.01509: running the handler 25675 1727204024.01512: _low_level_execute_command(): starting 25675 1727204024.01519: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204024.02353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204024.02387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204024.02630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.02931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.04603: stdout chunk (state=3): >>>/root <<< 25675 1727204024.04882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.04887: stdout chunk (state=3): >>><<< 25675 1727204024.04889: stderr chunk (state=3): >>><<< 25675 1727204024.04892: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204024.05008: _low_level_execute_command(): starting 25675 1727204024.05012: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212 `" && echo ansible-tmp-1727204024.048653-29250-121508032590212="` echo /root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212 `" ) && sleep 0' 25675 1727204024.06293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.06310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204024.06481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.06485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.08399: stdout chunk (state=3): >>>ansible-tmp-1727204024.048653-29250-121508032590212=/root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212 <<< 25675 1727204024.08500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.08556: stderr chunk (state=3): >>><<< 25675 1727204024.08591: stdout chunk (state=3): >>><<< 25675 1727204024.08615: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204024.048653-29250-121508032590212=/root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204024.08659: variable 'ansible_module_compression' from source: unknown 25675 1727204024.08722: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25675 1727204024.08760: variable 'ansible_facts' from source: unknown 25675 1727204024.09140: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/AnsiballZ_stat.py 25675 1727204024.09321: Sending initial data 25675 1727204024.09324: Sent initial data (152 bytes) 25675 1727204024.10782: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.10786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.10789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204024.10791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.10793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.12409: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 
debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204024.12494: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204024.12593: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpx4a9j1k0 /root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/AnsiballZ_stat.py <<< 25675 1727204024.12596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/AnsiballZ_stat.py" <<< 25675 1727204024.12689: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpx4a9j1k0" to remote "/root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/AnsiballZ_stat.py" <<< 25675 1727204024.13579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.13678: stderr chunk (state=3): >>><<< 25675 1727204024.14086: stdout chunk (state=3): >>><<< 25675 1727204024.14089: done transferring module to remote 25675 1727204024.14091: _low_level_execute_command(): starting 25675 1727204024.14093: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/ /root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/AnsiballZ_stat.py && sleep 0' 25675 1727204024.14808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204024.14821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.14879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.15023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.15035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.15149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.17014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.17018: stdout chunk (state=3): >>><<< 25675 1727204024.17028: stderr chunk (state=3): >>><<< 25675 1727204024.17066: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204024.17074: _low_level_execute_command(): starting 25675 1727204024.17078: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/AnsiballZ_stat.py && sleep 0' 25675 1727204024.17707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204024.17717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204024.17728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204024.17752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204024.17765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204024.17772: stderr chunk (state=3): >>>debug2: match not found <<< 25675 1727204024.17792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.17838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204024.17856: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.17904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.17916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204024.17962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.18031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.33329: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, 
"checksum_algorithm": "sha1"}}} <<< 25675 1727204024.34848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 25675 1727204024.34852: stdout chunk (state=3): >>><<< 25675 1727204024.34854: stderr chunk (state=3): >>><<< 25675 1727204024.34857: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
25675 1727204024.34859: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204024.34861: _low_level_execute_command(): starting 25675 1727204024.34863: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204024.048653-29250-121508032590212/ > /dev/null 2>&1 && sleep 0' 25675 1727204024.36028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204024.36032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.36069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.36097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204024.36124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.36272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.38250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.38253: stdout chunk (state=3): >>><<< 25675 1727204024.38256: stderr chunk (state=3): >>><<< 25675 1727204024.38481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204024.38490: handler run complete 25675 1727204024.38493: attempt loop complete, returning result 25675 1727204024.38495: _execute() done 25675 1727204024.38497: dumping result to json 25675 1727204024.38499: done dumping result, returning 25675 1727204024.38502: done running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 [028d2410-947f-41bd-b19d-000000000554] 25675 1727204024.38504: sending task result for task 028d2410-947f-41bd-b19d-000000000554 25675 1727204024.38580: done sending task result for task 028d2410-947f-41bd-b19d-000000000554 25675 1727204024.38583: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 25675 1727204024.38646: no more pending results, returning what we have 25675 1727204024.38651: results queue empty 25675 1727204024.38651: checking for any_errors_fatal 25675 1727204024.38653: done checking for any_errors_fatal 25675 1727204024.38654: checking for max_fail_percentage 25675 1727204024.38656: done checking for max_fail_percentage 25675 1727204024.38657: checking to see if all hosts have failed and the running result is not ok 25675 1727204024.38658: done checking to see if all hosts have failed 25675 1727204024.38659: getting the remaining hosts for this loop 25675 1727204024.38660: done getting the remaining hosts for this loop 25675 1727204024.38665: getting the next task for host managed-node2 25675 1727204024.38677: done getting next task for host managed-node2 25675 1727204024.38680: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 25675 1727204024.38683: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204024.38689: getting variables 25675 1727204024.38690: in VariableManager get_vars() 25675 1727204024.38724: Calling all_inventory to load vars for managed-node2 25675 1727204024.38727: Calling groups_inventory to load vars for managed-node2 25675 1727204024.38731: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204024.38743: Calling all_plugins_play to load vars for managed-node2 25675 1727204024.38747: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204024.38750: Calling groups_plugins_play to load vars for managed-node2 25675 1727204024.42182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204024.44527: done with get_vars() 25675 1727204024.44565: done getting variables 25675 1727204024.44652: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25675 1727204024.44883: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:53:44 -0400 (0:00:00.457) 0:00:43.900 ***** 25675 1727204024.44919: entering _queue_task() for managed-node2/assert 25675 1727204024.45364: worker is 1 (out of 1 available) 25675 1727204024.45378: exiting _queue_task() for managed-node2/assert 25675 1727204024.45390: done queuing things up, now waiting for results queue to drain 25675 1727204024.45391: waiting for pending results... 
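The final check in this block is assert_device_absent.yml:5. The condition it evaluates, 'not interface_stat.stat.exists', appears verbatim in the entries below; a hedged sketch of such a task follows (the msg wording is again an assumption).

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists
    # hypothetical failure message, not taken from this log
    msg: "Interface {{ interface }} still exists"

Once this passes, the entries that close this section only flush handlers and walk the host state forward until no next task is found for managed-node2.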
25675 1727204024.45968: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'lsr27' 25675 1727204024.46062: in run() - task 028d2410-947f-41bd-b19d-00000000053d 25675 1727204024.46189: variable 'ansible_search_path' from source: unknown 25675 1727204024.46192: variable 'ansible_search_path' from source: unknown 25675 1727204024.46294: calling self._execute() 25675 1727204024.46391: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.46402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204024.46416: variable 'omit' from source: magic vars 25675 1727204024.46988: variable 'ansible_distribution_major_version' from source: facts 25675 1727204024.47111: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204024.47114: variable 'omit' from source: magic vars 25675 1727204024.47281: variable 'omit' from source: magic vars 25675 1727204024.47339: variable 'interface' from source: set_fact 25675 1727204024.47440: variable 'omit' from source: magic vars 25675 1727204024.47538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204024.47590: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204024.47617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204024.47620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204024.47622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204024.47667: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204024.47672: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.47674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204024.47763: Set connection var ansible_shell_type to sh 25675 1727204024.47766: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204024.47768: Set connection var ansible_timeout to 10 25675 1727204024.47770: Set connection var ansible_pipelining to False 25675 1727204024.47781: Set connection var ansible_shell_executable to /bin/sh 25675 1727204024.47786: Set connection var ansible_connection to ssh 25675 1727204024.47822: variable 'ansible_shell_executable' from source: unknown 25675 1727204024.47826: variable 'ansible_connection' from source: unknown 25675 1727204024.47865: variable 'ansible_module_compression' from source: unknown 25675 1727204024.47868: variable 'ansible_shell_type' from source: unknown 25675 1727204024.47870: variable 'ansible_shell_executable' from source: unknown 25675 1727204024.47873: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.47874: variable 'ansible_pipelining' from source: unknown 25675 1727204024.47878: variable 'ansible_timeout' from source: unknown 25675 1727204024.47880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204024.48007: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 25675 1727204024.48023: variable 'omit' from source: magic vars 25675 1727204024.48027: starting attempt loop 25675 1727204024.48030: running the handler 25675 1727204024.48174: variable 'interface_stat' from source: set_fact 25675 1727204024.48179: Evaluated conditional (not interface_stat.stat.exists): True 25675 1727204024.48182: handler run complete 25675 1727204024.48217: attempt loop complete, returning result 25675 1727204024.48222: _execute() done 25675 1727204024.48225: dumping result to json 25675 1727204024.48227: done dumping result, returning 25675 1727204024.48229: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'lsr27' [028d2410-947f-41bd-b19d-00000000053d] 25675 1727204024.48231: sending task result for task 028d2410-947f-41bd-b19d-00000000053d ok: [managed-node2] => { "changed": false } MSG: All assertions passed 25675 1727204024.48457: no more pending results, returning what we have 25675 1727204024.48461: results queue empty 25675 1727204024.48462: checking for any_errors_fatal 25675 1727204024.48468: done checking for any_errors_fatal 25675 1727204024.48472: checking for max_fail_percentage 25675 1727204024.48474: done checking for max_fail_percentage 25675 1727204024.48474: checking to see if all hosts have failed and the running result is not ok 25675 1727204024.48477: done checking to see if all hosts have failed 25675 1727204024.48478: getting the remaining hosts for this loop 25675 1727204024.48479: done getting the remaining hosts for this loop 25675 1727204024.48483: getting the next task for host managed-node2 25675 1727204024.48493: done getting next task for host managed-node2 25675 1727204024.48494: ^ task is: TASK: meta (flush_handlers) 25675 1727204024.48498: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204024.48501: getting variables 25675 1727204024.48502: in VariableManager get_vars() 25675 1727204024.48526: Calling all_inventory to load vars for managed-node2 25675 1727204024.48528: Calling groups_inventory to load vars for managed-node2 25675 1727204024.48534: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204024.48542: Calling all_plugins_play to load vars for managed-node2 25675 1727204024.48545: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204024.48547: Calling groups_plugins_play to load vars for managed-node2 25675 1727204024.49089: done sending task result for task 028d2410-947f-41bd-b19d-00000000053d 25675 1727204024.49092: WORKER PROCESS EXITING 25675 1727204024.50421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204024.51598: done with get_vars() 25675 1727204024.51614: done getting variables 25675 1727204024.51663: in VariableManager get_vars() 25675 1727204024.51669: Calling all_inventory to load vars for managed-node2 25675 1727204024.51671: Calling groups_inventory to load vars for managed-node2 25675 1727204024.51672: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204024.51677: Calling all_plugins_play to load vars for managed-node2 25675 1727204024.51681: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204024.51683: Calling groups_plugins_play to load vars for managed-node2 25675 1727204024.52427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204024.55353: done with get_vars() 25675 1727204024.55401: done queuing things up, now waiting for results queue to drain 25675 1727204024.55404: results queue empty 25675 1727204024.55405: checking for any_errors_fatal 25675 1727204024.55407: done checking for any_errors_fatal 25675 1727204024.55408: checking for max_fail_percentage 25675 1727204024.55410: done checking for max_fail_percentage 25675 1727204024.55410: checking to see if all hosts have failed and the running result is not ok 25675 1727204024.55411: done checking to see if all hosts have failed 25675 1727204024.55417: getting the remaining hosts for this loop 25675 1727204024.55418: done getting the remaining hosts for this loop 25675 1727204024.55421: getting the next task for host managed-node2 25675 1727204024.55425: done getting next task for host managed-node2 25675 1727204024.55427: ^ task is: TASK: meta (flush_handlers) 25675 1727204024.55429: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204024.55431: getting variables 25675 1727204024.55432: in VariableManager get_vars() 25675 1727204024.55442: Calling all_inventory to load vars for managed-node2 25675 1727204024.55445: Calling groups_inventory to load vars for managed-node2 25675 1727204024.55447: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204024.55452: Calling all_plugins_play to load vars for managed-node2 25675 1727204024.55455: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204024.55458: Calling groups_plugins_play to load vars for managed-node2 25675 1727204024.58365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204024.60413: done with get_vars() 25675 1727204024.60438: done getting variables 25675 1727204024.60508: in VariableManager get_vars() 25675 1727204024.60518: Calling all_inventory to load vars for managed-node2 25675 1727204024.60521: Calling groups_inventory to load vars for managed-node2 25675 1727204024.60523: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204024.60527: Calling all_plugins_play to load vars for managed-node2 25675 1727204024.60533: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204024.60536: Calling groups_plugins_play to load vars for managed-node2 25675 1727204024.61709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204024.62847: done with get_vars() 25675 1727204024.62864: done queuing things up, now waiting for results queue to drain 25675 1727204024.62866: results queue empty 25675 1727204024.62866: checking for any_errors_fatal 25675 1727204024.62867: done checking for any_errors_fatal 25675 1727204024.62868: checking for max_fail_percentage 25675 1727204024.62869: done checking for max_fail_percentage 25675 1727204024.62869: checking to see if all hosts have failed and the running result is not ok 25675 1727204024.62870: done checking to see if all hosts have failed 25675 1727204024.62870: getting the remaining hosts for this loop 25675 1727204024.62871: done getting the remaining hosts for this loop 25675 1727204024.62873: getting the next task for host managed-node2 25675 1727204024.62877: done getting next task for host managed-node2 25675 1727204024.62877: ^ task is: None 25675 1727204024.62880: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204024.62881: done queuing things up, now waiting for results queue to drain 25675 1727204024.62882: results queue empty 25675 1727204024.62882: checking for any_errors_fatal 25675 1727204024.62882: done checking for any_errors_fatal 25675 1727204024.62883: checking for max_fail_percentage 25675 1727204024.62883: done checking for max_fail_percentage 25675 1727204024.62884: checking to see if all hosts have failed and the running result is not ok 25675 1727204024.62884: done checking to see if all hosts have failed 25675 1727204024.62885: getting the next task for host managed-node2 25675 1727204024.62887: done getting next task for host managed-node2 25675 1727204024.62887: ^ task is: None 25675 1727204024.62888: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204024.62920: in VariableManager get_vars() 25675 1727204024.62931: done with get_vars() 25675 1727204024.62934: in VariableManager get_vars() 25675 1727204024.62940: done with get_vars() 25675 1727204024.62943: variable 'omit' from source: magic vars 25675 1727204024.62964: in VariableManager get_vars() 25675 1727204024.62970: done with get_vars() 25675 1727204024.62988: variable 'omit' from source: magic vars

PLAY [Verify that cleanup restored state to default] ***************************

25675 1727204024.63106: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25675 1727204024.63128: getting the remaining hosts for this loop 25675 1727204024.63131: done getting the remaining hosts for this loop 25675 1727204024.63134: getting the next task for host managed-node2 25675 1727204024.63136: done getting next task for host managed-node2 25675 1727204024.63141: ^ task is: TASK: Gathering Facts 25675 1727204024.63143: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204024.63145: getting variables 25675 1727204024.63146: in VariableManager get_vars() 25675 1727204024.63155: Calling all_inventory to load vars for managed-node2 25675 1727204024.63157: Calling groups_inventory to load vars for managed-node2 25675 1727204024.63159: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204024.63164: Calling all_plugins_play to load vars for managed-node2 25675 1727204024.63166: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204024.63172: Calling groups_plugins_play to load vars for managed-node2 25675 1727204024.64331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204024.65185: done with get_vars() 25675 1727204024.65198: done getting variables 25675 1727204024.65229: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
Tuesday 24 September 2024 14:53:44 -0400 (0:00:00.203) 0:00:44.103 *****

25675 1727204024.65246: entering _queue_task() for managed-node2/gather_facts 25675 1727204024.65491: worker is 1 (out of 1 available) 25675 1727204024.65505: exiting _queue_task() for managed-node2/gather_facts 25675 1727204024.65516: done queuing things up, now waiting for results queue to drain 25675 1727204024.65517: waiting for pending results...
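The assert that completed just above ("Assert that the interface is absent - 'lsr27'") passed because the conditional (not interface_stat.stat.exists) evaluated to True against a previously registered stat result. As a rough illustration only, a stat/assert pair of the following shape would produce the same evaluation; the /sys/class/net path, the play wrapper, and the vars block are assumptions for this sketch, not the actual tasks in tests_ethernet.yml:

    # Hypothetical reproduction of the "interface is absent" check seen in the log above.
    - hosts: managed-node2
      gather_facts: true
      vars:
        interface: lsr27                                 # matches the interface name in the task title
      tasks:
        - name: Get stat of the interface device file
          ansible.builtin.stat:
            path: "/sys/class/net/{{ interface }}"       # assumed location; an absent device means stat.exists is false
          register: interface_stat

        - name: Assert that the interface is absent - '{{ interface }}'
          ansible.builtin.assert:
            that:
              - not interface_stat.stat.exists           # the exact conditional evaluated in the log
          when: ansible_distribution_major_version != '6'   # the distribution guard also evaluated above

The default success message of ansible.builtin.assert is "All assertions passed", which is exactly the MSG shown in the task result above.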
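The "Set connection var ..." lines earlier show the effective connection settings for managed-node2: the sh shell type, /bin/sh as the shell executable, ZIP_DEFLATED module compression, a 10-second timeout, pipelining disabled, and the ssh connection plugin, with only ansible_host and ansible_ssh_extra_args reported as coming from host vars. Most of those values are stock Ansible defaults, so an inventory entry of roughly the following shape would be enough to reproduce them; the ansible_ssh_extra_args value here is a placeholder, since the log records only that the variable is set, not its content:

    all:
      hosts:
        managed-node2:
          ansible_host: 10.31.13.254                          # the address the SSH debug output below connects to
          ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder value for illustration only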
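The "Gathering Facts" task queued here runs the implicit ansible.legacy.setup module over the same SSH connection; the module_args echoed at the end of its result further down (gather_subset: ["all"], gather_timeout: 10, filter: [], fact_path: "/etc/ansible/facts.d") are the defaults. A minimal sketch of an explicit task that gathers the equivalent facts, assuming those same defaults:

    - hosts: managed-node2
      gather_facts: false                  # gather explicitly below instead of relying on the implicit task
      tasks:
        - name: Gather facts with the same arguments the implicit task uses
          ansible.builtin.setup:
            gather_subset:
              - all
            gather_timeout: 10
            fact_path: /etc/ansible/facts.d

        - name: Show the facts the distribution guard depends on
          ansible.builtin.debug:
            msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"

The facts returned this way include ansible_distribution_major_version, which is what the (ansible_distribution_major_version != '6') conditionals throughout this run are evaluated against.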
25675 1727204024.65685: running TaskExecutor() for managed-node2/TASK: Gathering Facts 25675 1727204024.65748: in run() - task 028d2410-947f-41bd-b19d-00000000056d 25675 1727204024.65763: variable 'ansible_search_path' from source: unknown 25675 1727204024.65793: calling self._execute() 25675 1727204024.65858: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.65863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204024.65874: variable 'omit' from source: magic vars 25675 1727204024.66147: variable 'ansible_distribution_major_version' from source: facts 25675 1727204024.66156: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204024.66161: variable 'omit' from source: magic vars 25675 1727204024.66183: variable 'omit' from source: magic vars 25675 1727204024.66210: variable 'omit' from source: magic vars 25675 1727204024.66244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204024.66271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204024.66291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204024.66307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204024.66317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204024.66340: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204024.66343: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.66345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204024.66416: Set connection var ansible_shell_type to sh 25675 1727204024.66420: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204024.66425: Set connection var ansible_timeout to 10 25675 1727204024.66430: Set connection var ansible_pipelining to False 25675 1727204024.66435: Set connection var ansible_shell_executable to /bin/sh 25675 1727204024.66438: Set connection var ansible_connection to ssh 25675 1727204024.66466: variable 'ansible_shell_executable' from source: unknown 25675 1727204024.66469: variable 'ansible_connection' from source: unknown 25675 1727204024.66471: variable 'ansible_module_compression' from source: unknown 25675 1727204024.66474: variable 'ansible_shell_type' from source: unknown 25675 1727204024.66481: variable 'ansible_shell_executable' from source: unknown 25675 1727204024.66483: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204024.66485: variable 'ansible_pipelining' from source: unknown 25675 1727204024.66487: variable 'ansible_timeout' from source: unknown 25675 1727204024.66489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204024.66619: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204024.66633: variable 'omit' from source: magic vars 25675 1727204024.66637: starting attempt loop 25675 1727204024.66639: running the 
handler 25675 1727204024.66651: variable 'ansible_facts' from source: unknown 25675 1727204024.66666: _low_level_execute_command(): starting 25675 1727204024.66673: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204024.67156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204024.67179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.67194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.67245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.67249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.67333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.69073: stdout chunk (state=3): >>>/root <<< 25675 1727204024.69173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.69206: stderr chunk (state=3): >>><<< 25675 1727204024.69209: stdout chunk (state=3): >>><<< 25675 1727204024.69230: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204024.69239: _low_level_execute_command(): starting 25675 1727204024.69245: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247 `" && echo 
ansible-tmp-1727204024.6922755-29284-111480338593247="` echo /root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247 `" ) && sleep 0' 25675 1727204024.69682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204024.69685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204024.69688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 25675 1727204024.69697: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204024.69699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.69736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.69739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.69817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.71746: stdout chunk (state=3): >>>ansible-tmp-1727204024.6922755-29284-111480338593247=/root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247 <<< 25675 1727204024.71848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.71873: stderr chunk (state=3): >>><<< 25675 1727204024.71878: stdout chunk (state=3): >>><<< 25675 1727204024.71896: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204024.6922755-29284-111480338593247=/root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204024.71917: variable 'ansible_module_compression' from 
source: unknown 25675 1727204024.71953: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25675 1727204024.72002: variable 'ansible_facts' from source: unknown 25675 1727204024.72129: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/AnsiballZ_setup.py 25675 1727204024.72222: Sending initial data 25675 1727204024.72225: Sent initial data (154 bytes) 25675 1727204024.72656: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204024.72659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204024.72663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.72666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204024.72668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.72722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.72725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204024.72729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.72801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.74382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25675 1727204024.74387: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204024.74448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204024.74523: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpkgpgaacw /root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/AnsiballZ_setup.py <<< 25675 1727204024.74526: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/AnsiballZ_setup.py" <<< 25675 1727204024.74594: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 25675 1727204024.74598: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpkgpgaacw" to remote "/root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/AnsiballZ_setup.py" <<< 25675 1727204024.75767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.75813: stderr chunk (state=3): >>><<< 25675 1727204024.75816: stdout chunk (state=3): >>><<< 25675 1727204024.75833: done transferring module to remote 25675 1727204024.75842: _low_level_execute_command(): starting 25675 1727204024.75847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/ /root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/AnsiballZ_setup.py && sleep 0' 25675 1727204024.76289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204024.76292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.76294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25675 1727204024.76297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204024.76302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.76339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.76342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.76418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204024.78224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204024.78247: stderr chunk (state=3): >>><<< 25675 1727204024.78250: stdout chunk (state=3): >>><<< 25675 1727204024.78264: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204024.78266: _low_level_execute_command(): starting 25675 1727204024.78270: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/AnsiballZ_setup.py && sleep 0' 25675 1727204024.78701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204024.78704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204024.78707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204024.78709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204024.78753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204024.78756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204024.78836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204025.43776: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2927, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 604, "free": 2927}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.ama<<< 25675 1727204025.43798: stdout chunk (state=3): >>>zon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": 
null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 611, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785747456, "block_size": 4096, "block_total": 65519099, "block_available": 63912536, "block_used": 1606563, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "45", "epoch": "1727204025", "epoch_int": "1727204025", "date": "2024-09-24", "time": "14:53:45", "iso8601_micro": "2024-09-24T18:53:45.395499Z", "iso8601": "2024-09-24T18:53:45Z", "iso8601_basic": "20240924T145345395499", "iso8601_basic_short": "20240924T145345", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_loadavg": {"1m": 0.45263671875, "5m": 0.42626953125, "15m": 0.23193359375}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25675 1727204025.45749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204025.45816: stderr chunk (state=3): >>><<< 25675 1727204025.45823: stdout chunk (state=3): >>><<< 25675 1727204025.45910: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2927, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 604, "free": 
2927}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 611, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785747456, "block_size": 4096, "block_total": 65519099, "block_available": 63912536, "block_used": 1606563, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "45", "epoch": "1727204025", "epoch_int": "1727204025", "date": "2024-09-24", "time": "14:53:45", "iso8601_micro": "2024-09-24T18:53:45.395499Z", "iso8601": "2024-09-24T18:53:45Z", "iso8601_basic": "20240924T145345395499", "iso8601_basic_short": "20240924T145345", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_loadavg": {"1m": 0.45263671875, "5m": 0.42626953125, "15m": 0.23193359375}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off 
[fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], 
"fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204025.46167: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204025.46170: _low_level_execute_command(): starting 25675 1727204025.46173: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204024.6922755-29284-111480338593247/ > /dev/null 2>&1 && sleep 0' 25675 1727204025.46549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204025.46562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204025.46573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204025.46620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204025.46634: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204025.46713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204025.48561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204025.48618: stderr chunk (state=3): >>><<< 25675 1727204025.48623: stdout chunk (state=3): >>><<< 25675 1727204025.48645: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204025.48771: handler run complete 25675 1727204025.48823: variable 'ansible_facts' from source: unknown 25675 1727204025.48975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204025.49322: variable 'ansible_facts' from source: unknown 25675 1727204025.49409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204025.49527: attempt loop complete, returning result 25675 1727204025.49530: _execute() done 25675 1727204025.49545: dumping result to json 25675 1727204025.49591: done dumping result, returning 25675 1727204025.49608: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-41bd-b19d-00000000056d] 25675 1727204025.49610: sending task result for task 028d2410-947f-41bd-b19d-00000000056d 25675 1727204025.50145: done sending task result for task 028d2410-947f-41bd-b19d-00000000056d 25675 1727204025.50148: WORKER PROCESS EXITING ok: [managed-node2] 25675 1727204025.50545: no more pending results, returning what we have 25675 1727204025.50547: results queue empty 25675 1727204025.50548: checking for any_errors_fatal 25675 1727204025.50549: done checking for any_errors_fatal 25675 1727204025.50549: checking for max_fail_percentage 25675 1727204025.50550: done checking for max_fail_percentage 25675 1727204025.50551: checking to see if all hosts have failed and the running result is not ok 25675 1727204025.50552: done checking to see if all hosts have failed 25675 1727204025.50552: getting the remaining hosts for this loop 25675 1727204025.50553: done getting the remaining hosts for this loop 25675 1727204025.50555: getting the next task for host managed-node2 25675 1727204025.50559: done getting next task for host managed-node2 25675 1727204025.50560: ^ task is: TASK: meta (flush_handlers) 25675 1727204025.50562: 
^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204025.50565: getting variables 25675 1727204025.50566: in VariableManager get_vars() 25675 1727204025.50587: Calling all_inventory to load vars for managed-node2 25675 1727204025.50589: Calling groups_inventory to load vars for managed-node2 25675 1727204025.50591: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204025.50599: Calling all_plugins_play to load vars for managed-node2 25675 1727204025.50601: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204025.50603: Calling groups_plugins_play to load vars for managed-node2 25675 1727204025.55264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204025.56365: done with get_vars() 25675 1727204025.56401: done getting variables 25675 1727204025.56451: in VariableManager get_vars() 25675 1727204025.56458: Calling all_inventory to load vars for managed-node2 25675 1727204025.56460: Calling groups_inventory to load vars for managed-node2 25675 1727204025.56462: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204025.56465: Calling all_plugins_play to load vars for managed-node2 25675 1727204025.56466: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204025.56468: Calling groups_plugins_play to load vars for managed-node2 25675 1727204025.57411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204025.58730: done with get_vars() 25675 1727204025.58747: done queuing things up, now waiting for results queue to drain 25675 1727204025.58749: results queue empty 25675 1727204025.58750: checking for any_errors_fatal 25675 1727204025.58755: done checking for any_errors_fatal 25675 1727204025.58756: checking for max_fail_percentage 25675 1727204025.58757: done checking for max_fail_percentage 25675 1727204025.58757: checking to see if all hosts have failed and the running result is not ok 25675 1727204025.58763: done checking to see if all hosts have failed 25675 1727204025.58764: getting the remaining hosts for this loop 25675 1727204025.58765: done getting the remaining hosts for this loop 25675 1727204025.58767: getting the next task for host managed-node2 25675 1727204025.58771: done getting next task for host managed-node2 25675 1727204025.58774: ^ task is: TASK: Verify network state restored to default 25675 1727204025.58782: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204025.58787: getting variables 25675 1727204025.58788: in VariableManager get_vars() 25675 1727204025.58798: Calling all_inventory to load vars for managed-node2 25675 1727204025.58800: Calling groups_inventory to load vars for managed-node2 25675 1727204025.58803: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204025.58812: Calling all_plugins_play to load vars for managed-node2 25675 1727204025.58818: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204025.58822: Calling groups_plugins_play to load vars for managed-node2 25675 1727204025.60133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204025.61761: done with get_vars() 25675 1727204025.61985: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Tuesday 24 September 2024 14:53:45 -0400 (0:00:00.968) 0:00:45.071 ***** 25675 1727204025.62082: entering _queue_task() for managed-node2/include_tasks 25675 1727204025.62483: worker is 1 (out of 1 available) 25675 1727204025.62495: exiting _queue_task() for managed-node2/include_tasks 25675 1727204025.62507: done queuing things up, now waiting for results queue to drain 25675 1727204025.62508: waiting for pending results... 25675 1727204025.63098: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 25675 1727204025.63104: in run() - task 028d2410-947f-41bd-b19d-000000000078 25675 1727204025.63108: variable 'ansible_search_path' from source: unknown 25675 1727204025.63111: calling self._execute() 25675 1727204025.63140: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204025.63154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204025.63177: variable 'omit' from source: magic vars 25675 1727204025.63633: variable 'ansible_distribution_major_version' from source: facts 25675 1727204025.63649: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204025.63686: _execute() done 25675 1727204025.63691: dumping result to json 25675 1727204025.63693: done dumping result, returning 25675 1727204025.63695: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [028d2410-947f-41bd-b19d-000000000078] 25675 1727204025.63740: sending task result for task 028d2410-947f-41bd-b19d-000000000078 25675 1727204025.63874: no more pending results, returning what we have 25675 1727204025.63887: in VariableManager get_vars() 25675 1727204025.63937: Calling all_inventory to load vars for managed-node2 25675 1727204025.63941: Calling groups_inventory to load vars for managed-node2 25675 1727204025.63949: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204025.63966: Calling all_plugins_play to load vars for managed-node2 25675 1727204025.63971: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204025.63975: Calling groups_plugins_play to load vars for managed-node2 25675 1727204025.64855: done sending task result for task 028d2410-947f-41bd-b19d-000000000078 25675 1727204025.64861: WORKER PROCESS EXITING 25675 1727204025.66042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204025.67148: done with get_vars() 25675 1727204025.67164: 
variable 'ansible_search_path' from source: unknown 25675 1727204025.67181: we have included files to process 25675 1727204025.67182: generating all_blocks data 25675 1727204025.67183: done generating all_blocks data 25675 1727204025.67183: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25675 1727204025.67184: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25675 1727204025.67186: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25675 1727204025.67550: done processing included file 25675 1727204025.67551: iterating over new_blocks loaded from include file 25675 1727204025.67552: in VariableManager get_vars() 25675 1727204025.67560: done with get_vars() 25675 1727204025.67561: filtering new block on tags 25675 1727204025.67572: done filtering new block on tags 25675 1727204025.67573: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 25675 1727204025.67580: extending task lists for all hosts with included blocks 25675 1727204025.67609: done extending task lists 25675 1727204025.67610: done processing included files 25675 1727204025.67611: results queue empty 25675 1727204025.67611: checking for any_errors_fatal 25675 1727204025.67612: done checking for any_errors_fatal 25675 1727204025.67613: checking for max_fail_percentage 25675 1727204025.67613: done checking for max_fail_percentage 25675 1727204025.67614: checking to see if all hosts have failed and the running result is not ok 25675 1727204025.67615: done checking to see if all hosts have failed 25675 1727204025.67615: getting the remaining hosts for this loop 25675 1727204025.67616: done getting the remaining hosts for this loop 25675 1727204025.67617: getting the next task for host managed-node2 25675 1727204025.67620: done getting next task for host managed-node2 25675 1727204025.67622: ^ task is: TASK: Check routes and DNS 25675 1727204025.67624: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204025.67625: getting variables 25675 1727204025.67626: in VariableManager get_vars() 25675 1727204025.67633: Calling all_inventory to load vars for managed-node2 25675 1727204025.67634: Calling groups_inventory to load vars for managed-node2 25675 1727204025.67635: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204025.67639: Calling all_plugins_play to load vars for managed-node2 25675 1727204025.67641: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204025.67642: Calling groups_plugins_play to load vars for managed-node2 25675 1727204025.68492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204025.69561: done with get_vars() 25675 1727204025.69587: done getting variables 25675 1727204025.69627: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:53:45 -0400 (0:00:00.075) 0:00:45.147 ***** 25675 1727204025.69646: entering _queue_task() for managed-node2/shell 25675 1727204025.69999: worker is 1 (out of 1 available) 25675 1727204025.70011: exiting _queue_task() for managed-node2/shell 25675 1727204025.70022: done queuing things up, now waiting for results queue to drain 25675 1727204025.70024: waiting for pending results... 25675 1727204025.70264: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 25675 1727204025.70389: in run() - task 028d2410-947f-41bd-b19d-00000000057e 25675 1727204025.70400: variable 'ansible_search_path' from source: unknown 25675 1727204025.70404: variable 'ansible_search_path' from source: unknown 25675 1727204025.70429: calling self._execute() 25675 1727204025.70502: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204025.70507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204025.70516: variable 'omit' from source: magic vars 25675 1727204025.70846: variable 'ansible_distribution_major_version' from source: facts 25675 1727204025.70853: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204025.70856: variable 'omit' from source: magic vars 25675 1727204025.70906: variable 'omit' from source: magic vars 25675 1727204025.70944: variable 'omit' from source: magic vars 25675 1727204025.70994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204025.71022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204025.71038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204025.71057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204025.71092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204025.71104: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
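(For reference: the task header above, "Check routes and DNS" at check_network_dns.yml:6, only shows up in this log through its rendered shell command, which is echoed in the module result further below. A shell task of roughly that shape would look like the following sketch; the task name and the command body are taken from this log, but the YAML structure and module spelling are assumptions, since the playbook source itself is not reproduced here.)

    # Reconstruction from the _raw_params echoed in the result below; the
    # playbook source is not part of this log, so the exact YAML layout is
    # an assumption.
    - name: Check routes and DNS
      ansible.builtin.shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi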
25675 1727204025.71107: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204025.71110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204025.71211: Set connection var ansible_shell_type to sh 25675 1727204025.71214: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204025.71220: Set connection var ansible_timeout to 10 25675 1727204025.71225: Set connection var ansible_pipelining to False 25675 1727204025.71230: Set connection var ansible_shell_executable to /bin/sh 25675 1727204025.71232: Set connection var ansible_connection to ssh 25675 1727204025.71260: variable 'ansible_shell_executable' from source: unknown 25675 1727204025.71277: variable 'ansible_connection' from source: unknown 25675 1727204025.71281: variable 'ansible_module_compression' from source: unknown 25675 1727204025.71285: variable 'ansible_shell_type' from source: unknown 25675 1727204025.71287: variable 'ansible_shell_executable' from source: unknown 25675 1727204025.71289: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204025.71291: variable 'ansible_pipelining' from source: unknown 25675 1727204025.71294: variable 'ansible_timeout' from source: unknown 25675 1727204025.71295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204025.71414: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204025.71424: variable 'omit' from source: magic vars 25675 1727204025.71437: starting attempt loop 25675 1727204025.71441: running the handler 25675 1727204025.71444: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204025.71467: _low_level_execute_command(): starting 25675 1727204025.71477: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204025.72065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204025.72070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204025.72119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204025.72141: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204025.72221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204025.73925: stdout chunk (state=3): >>>/root <<< 25675 1727204025.74019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204025.74111: stderr chunk (state=3): >>><<< 25675 1727204025.74114: stdout chunk (state=3): >>><<< 25675 1727204025.74134: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204025.74159: _low_level_execute_command(): starting 25675 1727204025.74240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347 `" && echo ansible-tmp-1727204025.7414584-29320-264379304614347="` echo /root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347 `" ) && sleep 0' 25675 1727204025.75245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204025.75340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204025.75362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204025.75406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204025.75445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 25675 1727204025.75457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204025.75506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204025.75520: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204025.75535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204025.75610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204025.77605: stdout chunk (state=3): >>>ansible-tmp-1727204025.7414584-29320-264379304614347=/root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347 <<< 25675 1727204025.77653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204025.77885: stdout chunk (state=3): >>><<< 25675 1727204025.77889: stderr chunk (state=3): >>><<< 25675 1727204025.77892: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204025.7414584-29320-264379304614347=/root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204025.77894: variable 'ansible_module_compression' from source: unknown 25675 1727204025.77896: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727204025.77898: variable 'ansible_facts' from source: unknown 25675 1727204025.77980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/AnsiballZ_command.py 25675 1727204025.78228: Sending initial data 25675 1727204025.78241: Sent initial data (156 bytes) 25675 1727204025.78856: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204025.78919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204025.78935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204025.78960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204025.79066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204025.80913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204025.80978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25675 1727204025.81050: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmpj297df1j /root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/AnsiballZ_command.py <<< 25675 1727204025.81060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/AnsiballZ_command.py" <<< 25675 1727204025.81122: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmpj297df1j" to remote "/root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/AnsiballZ_command.py" <<< 25675 1727204025.82785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204025.82789: stdout chunk (state=3): >>><<< 25675 1727204025.82792: stderr chunk (state=3): >>><<< 25675 1727204025.82794: done transferring module to remote 25675 1727204025.82796: _low_level_execute_command(): starting 25675 1727204025.82799: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/ /root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/AnsiballZ_command.py && sleep 0' 25675 1727204025.84161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204025.84214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204025.84274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204025.84297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204025.84399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204025.86335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204025.86455: stderr chunk (state=3): >>><<< 25675 1727204025.86459: stdout chunk (state=3): >>><<< 25675 1727204025.86554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204025.86557: _low_level_execute_command(): starting 25675 1727204025.86560: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/AnsiballZ_command.py && sleep 0' 25675 1727204025.87396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204025.87419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204025.87434: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 25675 1727204025.87455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204025.87562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204026.04066: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3283sec preferred_lft 3283sec\n inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:53:46.030241", "end": "2024-09-24 14:53:46.039039", "delta": "0:00:00.008798", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727204026.05637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204026.05650: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204026.05774: stderr chunk (state=3): >>><<< 25675 1727204026.05783: stdout chunk (state=3): >>><<< 25675 1727204026.05787: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3283sec preferred_lft 3283sec\n inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:53:46.030241", "end": "2024-09-24 14:53:46.039039", "delta": "0:00:00.008798", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
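(The result above reports rc=0 with delta 0:00:00.008798, a default route via 10.31.12.1 on eth0, and three NetworkManager-provided nameservers, which matches the ansible_default_ipv4 and ansible_dns facts gathered earlier for managed-node2. As an illustrative cross-check only, those facts could be asserted on directly; no such assert task exists in the run recorded here.)

    # Illustrative only; the values come from the ansible_default_ipv4 /
    # ansible_dns facts and the command output shown in this log, but this
    # task is not part of the recorded playbook run.
    - name: Cross-check gathered network facts against the command output
      ansible.builtin.assert:
        that:
          - ansible_default_ipv4.interface == 'eth0'
          - ansible_default_ipv4.gateway == '10.31.12.1'
          - ansible_dns.nameservers | length == 3
        fail_msg: "Default route or DNS configuration differs from what the log shows"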
25675 1727204026.05853: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204026.05866: _low_level_execute_command(): starting 25675 1727204026.05888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204025.7414584-29320-264379304614347/ > /dev/null 2>&1 && sleep 0' 25675 1727204026.07152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204026.07180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204026.07203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204026.07322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204026.09260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204026.09363: stderr chunk (state=3): >>><<< 25675 1727204026.09401: stdout chunk (state=3): >>><<< 25675 1727204026.09425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204026.09437: handler run complete 25675 1727204026.09499: Evaluated conditional (False): False 25675 1727204026.09537: attempt loop complete, returning result 25675 1727204026.09545: _execute() done 25675 1727204026.09561: dumping result to json 25675 1727204026.09788: done dumping result, returning 25675 1727204026.09792: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [028d2410-947f-41bd-b19d-00000000057e] 25675 1727204026.09794: sending task result for task 028d2410-947f-41bd-b19d-00000000057e 25675 1727204026.09867: done sending task result for task 028d2410-947f-41bd-b19d-00000000057e 25675 1727204026.09871: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008798", "end": "2024-09-24 14:53:46.039039", "rc": 0, "start": "2024-09-24 14:53:46.030241" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3283sec preferred_lft 3283sec inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 25675 1727204026.09958: no more pending results, returning what we have 25675 1727204026.09962: results queue empty 25675 1727204026.09963: checking for any_errors_fatal 25675 1727204026.09965: done checking for any_errors_fatal 25675 1727204026.09965: checking for max_fail_percentage 25675 1727204026.09968: done checking for max_fail_percentage 25675 1727204026.09969: checking to see if all hosts have failed and the running result is not ok 25675 1727204026.09970: done checking to see if all hosts have failed 25675 1727204026.09970: getting the remaining hosts for this loop 25675 1727204026.09972: done getting the remaining hosts for this loop 25675 1727204026.09978: getting the next task for host managed-node2 25675 1727204026.09986: done getting next task for host managed-node2 25675 1727204026.09989: ^ task is: TASK: Verify DNS and network connectivity 25675 1727204026.09992: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204026.09996: getting variables 25675 1727204026.09998: in VariableManager get_vars() 25675 1727204026.10027: Calling all_inventory to load vars for managed-node2 25675 1727204026.10030: Calling groups_inventory to load vars for managed-node2 25675 1727204026.10034: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204026.10050: Calling all_plugins_play to load vars for managed-node2 25675 1727204026.10054: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204026.10058: Calling groups_plugins_play to load vars for managed-node2 25675 1727204026.12622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204026.13521: done with get_vars() 25675 1727204026.13553: done getting variables 25675 1727204026.13617: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:53:46 -0400 (0:00:00.439) 0:00:45.587 ***** 25675 1727204026.13647: entering _queue_task() for managed-node2/shell 25675 1727204026.14010: worker is 1 (out of 1 available) 25675 1727204026.14021: exiting _queue_task() for managed-node2/shell 25675 1727204026.14033: done queuing things up, now waiting for results queue to drain 25675 1727204026.14034: waiting for pending results... 
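(The task queued here, "Verify DNS and network connectivity" at check_network_dns.yml:24, is gated by the two conditionals evaluated in the records that follow; its command body is not visible in this portion of the log. A sketch of a task with the same guards, with the command left as a placeholder, might look like the following.)

    # The two conditions are the ones the log reports evaluating for this
    # task; 'connectivity_check_cmd' is a placeholder -- the real command is
    # not shown in this excerpt, and whether the guards are task-level or
    # inherited from the play is not visible either.
    - name: Verify DNS and network connectivity
      ansible.builtin.shell: "{{ connectivity_check_cmd }}"
      when:
        - ansible_distribution_major_version != '6'
        - ansible_facts["distribution"] == "CentOS"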
25675 1727204026.14402: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 25675 1727204026.14425: in run() - task 028d2410-947f-41bd-b19d-00000000057f 25675 1727204026.14447: variable 'ansible_search_path' from source: unknown 25675 1727204026.14456: variable 'ansible_search_path' from source: unknown 25675 1727204026.14503: calling self._execute() 25675 1727204026.14640: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204026.14724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204026.14758: variable 'omit' from source: magic vars 25675 1727204026.15195: variable 'ansible_distribution_major_version' from source: facts 25675 1727204026.15261: Evaluated conditional (ansible_distribution_major_version != '6'): True 25675 1727204026.15364: variable 'ansible_facts' from source: unknown 25675 1727204026.16012: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 25675 1727204026.16017: variable 'omit' from source: magic vars 25675 1727204026.16051: variable 'omit' from source: magic vars 25675 1727204026.16083: variable 'omit' from source: magic vars 25675 1727204026.16115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25675 1727204026.16141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25675 1727204026.16160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25675 1727204026.16183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204026.16190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25675 1727204026.16214: variable 'inventory_hostname' from source: host vars for 'managed-node2' 25675 1727204026.16217: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204026.16220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204026.16291: Set connection var ansible_shell_type to sh 25675 1727204026.16297: Set connection var ansible_module_compression to ZIP_DEFLATED 25675 1727204026.16303: Set connection var ansible_timeout to 10 25675 1727204026.16307: Set connection var ansible_pipelining to False 25675 1727204026.16313: Set connection var ansible_shell_executable to /bin/sh 25675 1727204026.16316: Set connection var ansible_connection to ssh 25675 1727204026.16336: variable 'ansible_shell_executable' from source: unknown 25675 1727204026.16339: variable 'ansible_connection' from source: unknown 25675 1727204026.16341: variable 'ansible_module_compression' from source: unknown 25675 1727204026.16344: variable 'ansible_shell_type' from source: unknown 25675 1727204026.16346: variable 'ansible_shell_executable' from source: unknown 25675 1727204026.16348: variable 'ansible_host' from source: host vars for 'managed-node2' 25675 1727204026.16351: variable 'ansible_pipelining' from source: unknown 25675 1727204026.16354: variable 'ansible_timeout' from source: unknown 25675 1727204026.16358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 25675 1727204026.16461: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204026.16470: variable 'omit' from source: magic vars 25675 1727204026.16477: starting attempt loop 25675 1727204026.16482: running the handler 25675 1727204026.16490: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25675 1727204026.16509: _low_level_execute_command(): starting 25675 1727204026.16517: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25675 1727204026.17183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204026.17188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.17192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 25675 1727204026.17194: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.17265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204026.17319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204026.19049: stdout chunk (state=3): >>>/root <<< 25675 1727204026.19156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204026.19191: stderr chunk (state=3): >>><<< 25675 1727204026.19195: stdout chunk (state=3): >>><<< 25675 1727204026.19216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204026.19229: _low_level_execute_command(): starting 25675 1727204026.19239: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834 `" && echo ansible-tmp-1727204026.1921668-29343-136492542561834="` echo /root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834 `" ) && sleep 0' 25675 1727204026.20053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204026.20111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204026.20149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 25675 1727204026.20157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204026.20160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.20415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204026.20456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204026.20593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204026.22600: stdout chunk (state=3): >>>ansible-tmp-1727204026.1921668-29343-136492542561834=/root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834 <<< 25675 1727204026.22728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204026.22749: stderr chunk (state=3): >>><<< 25675 1727204026.22752: stdout chunk (state=3): >>><<< 25675 1727204026.22763: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204026.1921668-29343-136492542561834=/root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204026.22791: variable 'ansible_module_compression' from source: unknown 25675 1727204026.22867: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25675almbh8x_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25675 1727204026.22883: variable 'ansible_facts' from source: unknown 25675 1727204026.22932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/AnsiballZ_command.py 25675 1727204026.23066: Sending initial data 25675 1727204026.23069: Sent initial data (156 bytes) 25675 1727204026.23695: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.23698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204026.23700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.23742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204026.23798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204026.23865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204026.25502: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25675 1727204026.25505: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25675 1727204026.25570: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25675 1727204026.25640: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25675almbh8x_/tmps3yznhg4 /root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/AnsiballZ_command.py <<< 25675 1727204026.25648: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/AnsiballZ_command.py" <<< 25675 1727204026.25710: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25675almbh8x_/tmps3yznhg4" to remote "/root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/AnsiballZ_command.py" <<< 25675 1727204026.26362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204026.26401: stderr chunk (state=3): >>><<< 25675 1727204026.26405: stdout chunk (state=3): >>><<< 25675 1727204026.26431: done transferring module to remote 25675 1727204026.26441: _low_level_execute_command(): starting 25675 1727204026.26445: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/ /root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/AnsiballZ_command.py && sleep 0' 25675 1727204026.27088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.27095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25675 1727204026.27097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.27164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204026.27184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204026.27285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204026.29068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204026.29212: stderr chunk (state=3): >>><<< 25675 1727204026.29216: stdout chunk (state=3): >>><<< 25675 1727204026.29219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204026.29221: _low_level_execute_command(): starting 25675 1727204026.29223: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/AnsiballZ_command.py && sleep 0' 25675 1727204026.29680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204026.29687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25675 1727204026.29701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.29723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25675 1727204026.29772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 25675 1727204026.29797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204026.29869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204026.84756: stdout chunk (state=3): >>> <<< 25675 1727204026.84774: stdout chunk (state=3): >>>{"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6715 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1362 0 --:--:-- --:--:-- --:--:-- 1366", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:53:46.450377", "end": "2024-09-24 14:53:46.846341", "delta": "0:00:00.395964", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25675 1727204026.86486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 25675 1727204026.86490: stdout chunk (state=3): >>><<< 25675 1727204026.86492: stderr chunk (state=3): >>><<< 25675 1727204026.86645: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6715 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1362 0 --:--:-- --:--:-- --:--:-- 1366", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:53:46.450377", "end": "2024-09-24 14:53:46.846341", "delta": "0:00:00.395964", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 25675 1727204026.86655: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25675 1727204026.86658: _low_level_execute_command(): starting 25675 1727204026.86660: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204026.1921668-29343-136492542561834/ > /dev/null 2>&1 && sleep 0' 25675 1727204026.87271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25675 1727204026.87290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204026.87334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 25675 1727204026.87430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25675 1727204026.87449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 25675 1727204026.87472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25675 1727204026.87590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25675 1727204026.89500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25675 1727204026.89520: stderr chunk (state=3): >>><<< 25675 1727204026.89523: stdout chunk (state=3): >>><<< 25675 1727204026.89537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25675 1727204026.89547: handler run complete 25675 1727204026.89564: Evaluated conditional (False): False 25675 1727204026.89572: attempt loop complete, returning result 25675 1727204026.89574: _execute() done 25675 1727204026.89581: dumping result to json 25675 1727204026.89586: done dumping result, returning 25675 1727204026.89594: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [028d2410-947f-41bd-b19d-00000000057f] 25675 1727204026.89599: sending task result for task 028d2410-947f-41bd-b19d-00000000057f 25675 1727204026.89710: done sending task result for task 028d2410-947f-41bd-b19d-00000000057f 25675 1727204026.89713: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.395964", "end": "2024-09-24 14:53:46.846341", "rc": 0, "start": "2024-09-24 14:53:46.450377" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6715 0 --:--:-- --:--:-- --:--:-- 6777 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1362 0 --:--:-- --:--:-- --:--:-- 1366 25675 1727204026.89781: no more pending results, returning what we have 25675 1727204026.89784: results queue empty 25675 1727204026.89785: checking for any_errors_fatal 25675 1727204026.89793: done checking for any_errors_fatal 25675 1727204026.89793: checking 
for max_fail_percentage 25675 1727204026.89795: done checking for max_fail_percentage 25675 1727204026.89796: checking to see if all hosts have failed and the running result is not ok 25675 1727204026.89796: done checking to see if all hosts have failed 25675 1727204026.89797: getting the remaining hosts for this loop 25675 1727204026.89798: done getting the remaining hosts for this loop 25675 1727204026.89802: getting the next task for host managed-node2 25675 1727204026.89814: done getting next task for host managed-node2 25675 1727204026.89815: ^ task is: TASK: meta (flush_handlers) 25675 1727204026.89817: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204026.89823: getting variables 25675 1727204026.89824: in VariableManager get_vars() 25675 1727204026.89852: Calling all_inventory to load vars for managed-node2 25675 1727204026.89854: Calling groups_inventory to load vars for managed-node2 25675 1727204026.89858: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204026.89868: Calling all_plugins_play to load vars for managed-node2 25675 1727204026.89871: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204026.89874: Calling groups_plugins_play to load vars for managed-node2 25675 1727204026.90958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204026.92716: done with get_vars() 25675 1727204026.92739: done getting variables 25675 1727204026.92816: in VariableManager get_vars() 25675 1727204026.92826: Calling all_inventory to load vars for managed-node2 25675 1727204026.92828: Calling groups_inventory to load vars for managed-node2 25675 1727204026.92831: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204026.92835: Calling all_plugins_play to load vars for managed-node2 25675 1727204026.92838: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204026.92841: Calling groups_plugins_play to load vars for managed-node2 25675 1727204026.94367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204026.96223: done with get_vars() 25675 1727204026.96368: done queuing things up, now waiting for results queue to drain 25675 1727204026.96371: results queue empty 25675 1727204026.96372: checking for any_errors_fatal 25675 1727204026.96377: done checking for any_errors_fatal 25675 1727204026.96381: checking for max_fail_percentage 25675 1727204026.96382: done checking for max_fail_percentage 25675 1727204026.96383: checking to see if all hosts have failed and the running result is not ok 25675 1727204026.96383: done checking to see if all hosts have failed 25675 1727204026.96384: getting the remaining hosts for this loop 25675 1727204026.96385: done getting the remaining hosts for this loop 25675 1727204026.96388: getting the next task for host managed-node2 25675 1727204026.96392: done getting next task for host managed-node2 25675 1727204026.96394: ^ task is: TASK: meta (flush_handlers) 25675 1727204026.96395: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25675 1727204026.96398: getting variables 25675 1727204026.96399: in VariableManager get_vars() 25675 1727204026.96408: Calling all_inventory to load vars for managed-node2 25675 1727204026.96410: Calling groups_inventory to load vars for managed-node2 25675 1727204026.96412: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204026.96418: Calling all_plugins_play to load vars for managed-node2 25675 1727204026.96420: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204026.96423: Calling groups_plugins_play to load vars for managed-node2 25675 1727204026.98120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204026.99740: done with get_vars() 25675 1727204026.99764: done getting variables 25675 1727204026.99836: in VariableManager get_vars() 25675 1727204026.99846: Calling all_inventory to load vars for managed-node2 25675 1727204026.99849: Calling groups_inventory to load vars for managed-node2 25675 1727204026.99851: Calling all_plugins_inventory to load vars for managed-node2 25675 1727204026.99856: Calling all_plugins_play to load vars for managed-node2 25675 1727204026.99859: Calling groups_plugins_inventory to load vars for managed-node2 25675 1727204026.99867: Calling groups_plugins_play to load vars for managed-node2 25675 1727204027.01203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25675 1727204027.02973: done with get_vars() 25675 1727204027.03010: done queuing things up, now waiting for results queue to drain 25675 1727204027.03012: results queue empty 25675 1727204027.03013: checking for any_errors_fatal 25675 1727204027.03014: done checking for any_errors_fatal 25675 1727204027.03015: checking for max_fail_percentage 25675 1727204027.03016: done checking for max_fail_percentage 25675 1727204027.03017: checking to see if all hosts have failed and the running result is not ok 25675 1727204027.03018: done checking to see if all hosts have failed 25675 1727204027.03018: getting the remaining hosts for this loop 25675 1727204027.03019: done getting the remaining hosts for this loop 25675 1727204027.03022: getting the next task for host managed-node2 25675 1727204027.03025: done getting next task for host managed-node2 25675 1727204027.03026: ^ task is: None 25675 1727204027.03028: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25675 1727204027.03029: done queuing things up, now waiting for results queue to drain 25675 1727204027.03030: results queue empty 25675 1727204027.03030: checking for any_errors_fatal 25675 1727204027.03031: done checking for any_errors_fatal 25675 1727204027.03031: checking for max_fail_percentage 25675 1727204027.03032: done checking for max_fail_percentage 25675 1727204027.03036: checking to see if all hosts have failed and the running result is not ok 25675 1727204027.03036: done checking to see if all hosts have failed 25675 1727204027.03038: getting the next task for host managed-node2 25675 1727204027.03040: done getting next task for host managed-node2 25675 1727204027.03041: ^ task is: None 25675 1727204027.03042: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2              : ok=83   changed=3    unreachable=0    failed=0    skipped=73   rescued=0    ignored=1

Tuesday 24 September 2024  14:53:47 -0400 (0:00:00.894)       0:00:46.482 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.03s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.83s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which services are running ---- 1.82s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.53s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.39s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.21s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.21s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
Gathering Facts --------------------------------------------------------- 1.12s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13
Gathering Facts --------------------------------------------------------- 1.11s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Create veth interface lsr27 --------------------------------------------- 1.11s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.09s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
Install iproute --------------------------------------------------------- 0.96s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33
Verify DNS and network connectivity ------------------------------------- 0.89s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gathering Facts --------------------------------------------------------- 0.89s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
25675 1727204027.03156: RUNNING CLEANUP
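
For readability, a minimal sketch of what the "Verify DNS and network connectivity" task traced above most likely looks like in check_network_dns.yml. The task name and the script body are taken from the logged module arguments (_raw_params with _uses_shell=true); the ansible.builtin.shell wrapper, the indentation, and the changed_when setting are assumptions inferred from the trace (the task is reported "ok" rather than "changed" after "Evaluated conditional (False)"), not a verbatim copy of the test file.

    # Hypothetical reconstruction -- script body matches the logged cmd;
    # the surrounding YAML is an assumption.
    - name: Verify DNS and network connectivity
      ansible.builtin.shell: |
        set -euo pipefail
        echo CHECK DNS AND CONNECTIVITY
        for host in mirrors.fedoraproject.org mirrors.centos.org; do
          if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
          fi
          if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
          fi
        done
      # Inferred: result shown as ok (changed=false) despite the module
      # returning changed=true, consistent with changed_when: false.
      changed_when: false

The curl progress meter that appears in the task's stderr is expected with this script, since curl is invoked without a silent flag; it does not indicate a failure (rc=0).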